Senior Software Engineer - Data Products

🏢 Roku  •  📍 India

Job Description

Teamwork makes the stream work.

Roku is changing how the world watches TV. Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the Team

The Data Insights team plays a critical role in Roku's Advertising organization, leading measurement and analytics initiatives that power decision-making across the advertising ecosystem. We develop and manage products that deliver actionable insights for advertisers while meeting the operational and analytical needs of internal teams. We work closely with Product Managers, Data Science, Ad Sales, Ads Operations, and multiple groups within Advertising Engineering to deliver high-impact solutions. Looking ahead, we are exploring AI-driven measurement capabilities to further enhance the effectiveness of advertising campaigns and strengthen internal analytics.

About the Role

We are seeking a highly skilled Senior Software Engineer with deep expertise in big data technologies, including Apache Spark and Apache Airflow.
This hybrid position bridges software engineering and data engineering, requiring the ability to design, build, and maintain scalable systems for both application development and large-scale data processing. In this role, you will collaborate with cross-functional teams to architect and manage robust, production-grade data products that power critical analytics and measurement capabilities. You will work with technologies such as Apache Spark, Apache Airflow, Trino, Druid, Spring Boot, StarRocks, and Looker to deliver reliable, high-performance solutions. The ideal candidate is a proactive, self-motivated professional with a strong track record of building high-scale data services and a dedication to delivering exceptional results.

What You'll Be Doing

Software Development
- Design and build APIs and backend services using Spring Boot to support data products and analytics workflows.
- Write clean, maintainable, and efficient code, ensuring adherence to best practices through code reviews.

Big Data Engineering
- Design, develop, and maintain data pipelines and ETL workflows using Apache Spark and Apache Airflow.
- Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance.
- Develop and fine-tune complex queries and analytics solutions using Druid, Trino, and StarRocks for large-scale datasets.
- Monitor, troubleshoot, and improve data systems to minimize downtime and maximize efficiency.

Collaboration & Mentorship
- Partner with data scientists, software engineers, and other teams to deliver integrated, high-quality solutions.
- Provide technical guidance and mentorship to junior engineers, promoting best practices in software and data engineering.

We're Excited If You Have

- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- 8+ years of experience in software and/or data engineering, with expertise in big data technologies such as Apache Spark and Apache Airflow.
- Expertise with at least one of the following: Apache Druid, StarRocks, or Trino.
- Strong understanding of SOLID principles and distributed systems architecture.
- Proven experience in distributed data processing, data warehousing, and real-time data pipelines.
- Advanced SQL skills, with expertise in query optimization for large datasets.
- Exceptional problem-solving abilities and the capacity to work independently or collaboratively.
- Excellent verbal and written communication skills.
- Experience with cloud platforms such as AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes. (preferred)
- Familiarity with additional big data technologies, including Hadoop and Kafka.
- Experience in AdTech, particularly advertising data platforms and campaign measurement. (preferred)
- Strong programming skills in Python, Java, or Scala. (preferred)
- Knowledge of CI/CD pipelines, DevOps practices, and infrastructure-as-code tools (e.g., Terraform). (preferred)
- Expertise in data modeling, schema design, and data visualization tools.
- Experience with building agentic AI systems to automate decision-making and enhance analytics workflows. (preferred)

Our Hybrid Work Approach

Roku fosters an inclusive and collaborative environment where teams work in the office Monday through Thursday. Fridays are flexible for remote work, except for employees whose roles are required to be in the office five days a week or who are in offices with a five-day in-office policy.

Benefits

Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits, which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension).
Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

Accommodations

Roku welcomes applicants of all backgrounds and provides reasonable accommodations and adjustments in accordance with applicable law. If you requi