We know why you are reading about this opportunity. You are driven to achieve goals. You are looking to make a direct impact. You want to work in a culture where your co-workers are part of a diverse team, communicate across departments, and have a positive attitude. If we had to guess, you are innovative with great ideas, want to bring efficiencies to processes, and are looking to grow your career. Are we right? If so, let’s talk about who we are.
Who We Are
Greenphire is a leading provider of clinical payment and communication solutions. We provide software as a service (SaaS) to reduce costs, increase participant retention, and produce quantifiable results for our clients in the clinical trial industry. Our vibrant culture focuses on four key values: All In, As a Team, For a Purpose, Solving Problems. We are a multi-year recipient of the Philadelphia Business Journal’s Best Places to Work award, and love to give shout-outs and awards to our employees.
Our For A Purpose committee champions philanthropic activities throughout the year so employees can give back to our community. We have a diversity committee that focuses on breaking down barriers, recognizing that our uniqueness is what makes us so successful!
DataOps Engineer
Greenphire is seeking an experienced DataOps engineer to join our Data Services team to manage data deliveries, improve deployment automation, and strengthen overall service reliability. The DataOps engineer will own the operation and troubleshooting of several data products, the enhancement of our cloud-based data stack, and the improvement of data observability and systems monitoring. The ideal candidate works well independently, is a strong team member, has a continuous desire to expand and grow their knowledge and skills, and has excellent communication skills. This is a unique opportunity to work in a high-growth company at the intersection of fintech and life science, where data is a key driver for the future.
Responsibilities
- Work closely with development and data teams to deploy software releases and updates, and identify and implement improvements to that process
- Set up and maintain source code repositories, workflows, and pipelines, and automate CI/CD processes with the applicable DevOps toolset
- Set up security and other scans, and integrate quality test suites for the deployed applications as part of the CI/CD process
- Set up data observability, application performance monitoring, and alerting dashboards with the respective toolset, along with automated health checks of systems and data services
- Write deployment, configuration, provisioning, monitoring, and other scripts as needed to complement the existing DevOps tools
- Provide production support and troubleshooting for system/platform problems, including incident management and root cause analysis
- Stay on top of industry best practices and trends as well as security and compliance requirements, and implement improvements to the existing toolset, procedures, and automation
- Collaborate and work as part of a team in an agile environment
Qualifications
- BS/BA in a technical field such as Computer Science or Mathematics, or relevant work experience and/or training
- 5 or more years of work experience as a data engineer or a DevOps engineer
- Hands-on experience with source code management, CI/CD, and DevOps tools such as GitLab, GitHub, Jenkins, Flux, or CircleCI
- Hands-on experience with cloud computing platforms (AWS preferred) and orchestration/containerization technologies like Docker, Kubernetes (EKS), and Helm
- Hands-on experience with system administration, Linux/Unix shell scripts and networking
- Hands-on experience with both OLTP and OLAP databases and SQL (e.g., Postgres, MSSQL, MySQL, Redshift, Snowflake)
- Experience managing and troubleshooting AWS Glue pipelines
- Knowledge of NoSQL databases (Elasticsearch/MongoDB/DynamoDB) is a plus
- Knowledge of application performance monitoring and alerting, and related tools like Splunk or Sumo Logic
- Knowledge of Infrastructure as Code and provisioning tools like Terraform and CloudFormation
- Experience with streaming platforms like Kafka, Kinesis, Flink, or Spark is a plus
- Knowledge of security and compliance requirements and practices; life science background a plus
- Excellent analytical, problem-solving, and troubleshooting skills
- Understanding of software development (Python/Java) and code coverage/test automation suites (e.g., SonarQube, Selenium, Great Expectations)
- Excellent communication and team collaboration skills as well as a good understanding of Agile methodologies