
Software Engineer - Data Infrastructure

Berkshire Grey | Bedford, Massachusetts | Onsite
This job is no longer open

Data Team Mission
Our mission on the Data Team is to bring value through data. We handle everything from schema design and streaming architecture to writing data management applications that enable world-class operations and monitoring of our robotic systems. We collaborate with teams across the organization, from internal machine learning groups to customer-facing teams, leveraging our expertise to help them achieve their goals.

Role Description

As a Data Infrastructure Software Engineer on Berkshire Grey's Data Team, you will improve, manage, and own data systems used across the company.

You will collaborate with various internal customer teams, ensuring a high level of service for both our internal and external customers.

Responsibilities

  • Develop software for data backup and management for on-premises robotic systems.
  • Design and build data pipelines to transform and transfer data to our data warehouse.
  • Advise internal customers and product teams on schema design, data APIs, and query/analysis needs.
  • Collaborate with our DevOps team to design and manage MongoDB, Kafka, and other components in Kubernetes, both in the cloud and at the edge.
  • Work with our Machine Learning team to automate and manage Machine Learning operations, enabling a high-throughput Machine Learning training and evaluation pipeline.

Background and Experience

Required

  • 3+ years of experience working with databases and/or data streaming platforms.
  • 3+ years of experience with an object-oriented programming language, preferably Python.
  • Experience developing in a Linux environment, using Git and GitHub, and utilizing a work tracking system such as Jira.
  • Demonstrated understanding of data schemas and basic schema design.
  • Proficiency with, and an in-depth understanding of, one or more databases such as MongoDB, Elasticsearch, or SQL databases.
  • Understanding of the differences and trade-offs between SQL and NoSQL databases.

Preferred

  • Experience using Snowflake as a data warehouse.
  • Familiarity with Apache Kafka.
  • 3+ years of experience with Python.
  • Familiarity with Docker and building Docker images.
  • Understanding of Kubernetes concepts, with experience deploying applications to Kubernetes.
  • Experience building systems with AWS and/or GCP.

6111-2401ZR

Life at Berkshire Grey

It's time to radically change the essential way we do business. At Berkshire Grey, our game-changing solutions combine AI and Robotics to automate omni-channel fulfillment for retailers, eCommerce, and logistics enterprises serving today's connected consumers. By transforming pick, pack and sort operations, our technology is a fundamental engine of change that moves you forward.
Thrive Here & What We Value

  • AI and Robotics Excellence
  • Ecommerce, Retail Replenishment, and Logistics Solutions
  • Advanced Pick, Pack, and Sort Technology
  • Inclusive Culture with Diverse Perspectives
  • Collaborative Work Environment
  • Emphasis on Quality and Efficiency
  • Continuous Learning and Improvement
  • Supportive Leadership
  • Customer Needs Focus
  • Technical Writing Proficiency
