Staff Computer Vision Engineer — Robotics & Autonomous Systems (Perception Lead)

SKEYEBOTS

5–10 years

Kochi

Posted: 12/02/2026

Job Description

Job Title: Staff Computer Vision Engineer — Robotics & Autonomous Systems

Company: SKEYEBOTS PVT. LTD.

Employment Type: Full-time

Location + Work Mode: Kochi (Hybrid 4 days/week in-office)

Relocation: Must relocate to Kochi (no relocation assistance provided)

Travel: ~20% (trials, customer demos, testing)

Start Date: Immediate

Reporting To: CTO

Work Authorization: Must be legally authorized to work in India (or able to obtain authorization)

About Us

We build defense- and public-safety-focused robotics and autonomy systems centered on AI-driven, fully autonomous multi-agent surveillance and rapid-deployment mission capabilities. Our first-generation system has been developed and field-tested, with key trials underway with the Indian Army and the Border Security Force (BSF), and interest from the National Disaster Management Authority and international emergency-response and defense stakeholders. Mission focus includes defense, security, precision response, and disaster/emergency response. We operate from Kochi and San Francisco.


Role Summary

We are seeking an experienced Staff-level Computer Vision Engineer to lead the design, development, and deployment of advanced perception systems for robotics platforms, including autonomous drones (UAVs) and other robotic systems. This role blends algorithm research, software engineering, and real-world system integration to deliver robust vision and AI capabilities.


Key Responsibilities
  • Lead the design, development, and optimization of computer vision and deep learning algorithms for object detection, segmentation, pose estimation, tracking, SLAM/VIO, and 3D reconstruction.
  • Architect and maintain perception pipelines covering data collection, annotation, preprocessing, model training, evaluation, and inference with a focus on robustness and real-world performance.
  • Collaborate with robotics hardware, navigation, control, and software teams to integrate perception with autonomy stacks and robotic systems.
  • Provide technical leadership and mentorship to engineering teams; help shape long-term technology strategy and best practices.
  • Translate research insights into scalable, production-ready code and features deployed on aerial and mobile robotic platforms.
  • Document technical work, contribute to internal/external publications where appropriate, and drive innovation in perception technologies.
Required Qualifications
  • 6+ years of professional experience applying computer vision, perception, or deep learning in robotics, autonomous systems, UAVs, or closely related domains.
  • Advanced degree (MS or PhD) in Computer Science, Robotics, Electrical Engineering, or related technical discipline (or equivalent experience).
  • Deep expertise in computer vision fundamentals and modern deep learning methods for tasks such as detection, segmentation, tracking, and localization/mapping.
  • Strong programming skills in Python and C++, with experience using deep learning frameworks such as TensorFlow or PyTorch.
  • Understanding of robotics perception challenges including multi-sensor fusion (camera, IMU, LiDAR), calibration, and real-time constraints.
  • Experience with development and deployment on resource-constrained systems, such as embedded/edge hardware typical in UAVs and robotics.
  • Proven ability to lead complex engineering projects and mentor other engineers.


Preferred Qualifications
  • Hands-on experience with robotics frameworks such as ROS/ROS2 and autonomy stacks (e.g., PX4, Nav2).
  • Familiarity with simulation tools for robotics (e.g., Gazebo, AirSim) and test infrastructures for UAVs.
  • Experience optimizing models for edge inference (quantization, pruning, accelerators like NVIDIA Jetson).
  • Background in geometric vision, sensor calibration, or multi-modal perception systems.



Nice-to-Have
  • Prior work on autonomous aerial systems or autonomous ground robots.
  • Experience building perception pipelines that scale with data and across product releases.
  • Ability to publish and present technical work to research or industry audiences.


What Success Looks Like (first 90 days)
  • A clear perception roadmap with measurable latency/accuracy/reliability targets.
  • Stabilized training + evaluation pipeline with repeatable benchmarks and regression tracking.
  • A production-grade inference stack running on target edge hardware with profiling + optimization.
  • Seamless integration with autonomy/control teams for field-ready demos and trials.
  • Strong technical leadership: mentoring, design reviews, and high engineering standards.


Compensation & Benefits (Industry-Leading)
  • Industry-leading compensation (role-appropriate) + ESOPs
  • Performance-linked bonus
  • Health insurance (self)
  • Company-provided high-end workstation + Jetson/dev kits and required tools
  • Travel reimbursement as per policy


Interview Process
  1. Recruiter screen
  2. Technical interview 1
  3. Technical interview 2
  4. Culture/leadership round


How to Apply
  • Apply via LinkedIn Easy Apply


Equal Opportunity & Inclusive Hiring

We are committed to a safe, respectful, inclusive workplace. We make employment decisions based on merit and job-related criteria and do not tolerate harassment or retaliation.
