Summary/objective
The Data Operations (DataOps) Engineer plays a crucial role in designing, building, and maintaining the data pipelines, storage, and delivery systems that support data-driven initiatives across our franchise network. The engineer also contributes to accurate and timely reports and visualizations. This role is part of a dynamic team of information and technology professionals who work closely with stakeholders to understand business requirements and translate them into scalable data solutions.
Essential functions
- Ensure Data Integrity and Security: Clean and organize data to ensure accuracy and reliability, and manage data access permissions to maintain security and compliance.
- Transform and Visualize Data: Transform operational data into reporting and analytics data sets and create compelling visualizations.
- Advocate Data Best Practices: Collaborate with various departments to discover and optimize data collection methods, ensure data quality, and support a data-driven culture within AFB.
- Proactive Data Correction: Identify inaccuracies in data and work with others to rectify them, maintaining high data quality.
- Develop Compliant Data Sets: Work with teams to develop data sets that comply with internal and external standards.
- Assist with Data Projects: Collaborate across business domains to assist with various data projects.
- Monitor Data Pipelines: Oversee data pipelines and automations to ensure smooth and efficient operations.
- Create Usable Documentation: Diagram and document the schema, model, lifecycle, and lineage of data elements.
Competencies
- Excellent multitasking abilities.
- Strong collaboration and teamwork skills.
- High level of self-driven motivation and accountability.
- Creativity in problem-solving and data visualization.
- Ability to communicate effectively with both technical and non-technical stakeholders.
Work environment
- Professional, corporate, and team-oriented environment.
- Hybrid work schedule with at least 2 days per week in the office.
Physical demands
- Prolonged periods sitting at a desk and working on a computer.
- Must be able to lift up to 15 pounds at times.
Travel requirements
Required education and experience
- Degree in Computer Science, Data Science, Information Systems, or a related field, plus 2+ years of experience in Data Engineering, Data Science, DevOps, or a related role; or 5+ years of equivalent work experience.
- Proficiency in MS SQL Server (e.g., Indexes, Stored Procedures, and Complex Views).
- Skilled at programming in Python, Scala, T-SQL, and/or R.
- Experience with data storage and management solutions (e.g., Data Warehouses, Data Lakes, and Data Lakehouses).
- Familiarity with ETL/ELT methodologies and tools (e.g., Azure Data Factory and Altova MapForce/FlowForce).
- Knowledge of dimensional data modeling principles and techniques.
- Understanding of efficient schema design, compatibility, and evolution practices.
- Expertise in Power BI for data visualization.
- Ability to turn data into descriptive, diagnostic, predictive, and prescriptive analytics.
Preferred education and experience
- Bachelor's degree preferred.
- Experience in franchising or retail environments.
- Experience with DAX and Power Query.
- Working knowledge of Microsoft’s Dataverse and Power Platform.