
Data Operations Engineer

FiberLight — Plano, Texas, United States (Onsite)
This job is no longer open

Position Overview 


 We are seeking a highly skilled and motivated Data Operations Engineer to join our Information Technology team. As a Data Operations Engineer, you will play a crucial role in managing, optimizing, and ensuring the smooth operation of our organization's data infrastructure and systems. Your primary responsibility will be to oversee the data pipeline, support data ingestion, transformation, and storage processes, and collaborate with cross-functional teams to enhance data quality and availability.

The ideal candidate should have a strong background in data engineering, database technologies, and data storage; hands-on experience with data pipelines and ETL processes; and a passion for maintaining a robust and scalable data ecosystem.

Essential Job Functions


  • Data Pipeline Management: Design, implement, and maintain efficient data pipelines to ingest, process, and store large volumes of structured and unstructured data from various sources.
  • ETL Development: Develop and optimize ETL processes to ensure the smooth and accurate movement of data from source to destination, adhering to best practices and data governance guidelines.
  • Data Quality Assurance: Monitor data quality and perform data validation to ensure consistency, accuracy, and completeness of data across the data ecosystem.
  • Database Management: Manage and maintain databases and data warehouses, ensuring data availability, security, and integrity.
  • Performance Optimization: Identify opportunities to improve data processing performance and implement optimizations to enhance data processing speed and efficiency.
  • Troubleshooting and Issue Resolution: Investigate and resolve data-related issues, ensuring timely and effective solutions to maintain data integrity and availability.
  • Collaboration and Communication: Collaborate with data engineers, data scientists, and other cross-functional teams to understand data requirements, provide technical expertise, and support data-driven initiatives.
  • Automation and Monitoring: Implement monitoring and alerting solutions to proactively identify and address data processing issues and ensure continuous data availability.
  • Documentation: Maintain comprehensive documentation of data processes, pipelines, and configurations to facilitate knowledge sharing and support the team.
  • Data Security and Compliance: Ensure compliance with data security and privacy policies, implementing necessary measures to safeguard sensitive data.

Requirements


  • Bachelor’s or master’s degree in computer science, engineering, or equivalent working experience.
  • Proven experience in data engineering, ETL development, and data pipeline management.
  • Proficiency in programming languages such as Python, Java, C++, or PowerShell.
  • Hands-on experience with data warehousing technologies (e.g., SQL Server, PostgreSQL, MySQL/MariaDB).
  • Familiarity with big data technologies (e.g., Hadoop, …) and cloud-based data platforms (e.g., AWS, Azure, GCP).
  • Strong analytical and problem-solving skills, with an ability to troubleshoot complex data issues.
  • Knowledge of data modeling, data governance, and data quality best practices.
  • Experience with data visualization tools (e.g., Tableau, Power BI) preferred.
  • Demonstrated ability to adapt to new technologies and eagerness to stay updated with industry trends in data management and governance.
  • Excellent communication skills and ability to work effectively in a collaborative team environment.

Additional Skills/Abilities


In addition to the specific qualifications and responsibilities mentioned in the job description, the role may require base-level knowledge in various technology areas typically expected in an IT role. Some of these general technology requirements include:

  • Operating Systems: Proficiency in operating systems such as Windows, Linux, or macOS, including system administration, file management, and basic command-line operations.
  • Networking Concepts: Understanding of networking fundamentals, including IP addressing, subnetting, DNS, DHCP, routing, and network protocols (TCP/IP, HTTP, HTTPS, etc.).
  • Hardware Fundamentals: Understanding of computer hardware components, including CPUs, RAM, storage devices, and peripherals.
  • Virtualization: Understanding of virtualization technologies such as VMware or Hyper-V, including virtual machine deployment and management.
  • Cloud Computing: Basic knowledge of cloud computing platforms like AWS, Azure, or Google Cloud, and an understanding of cloud services such as storage, compute, and networking.
  • Troubleshooting Skills: Ability to diagnose and resolve technical issues, both hardware and software-related, and provide effective solutions.
  • Documentation and Communication: Strong documentation skills and the ability to communicate technical concepts clearly to both technical and non-technical stakeholders.
  • Databases: Familiarity with relational database concepts and SQL querying. Knowledge of common database management systems like MySQL, Oracle, or PostgreSQL.
  • Monitoring and Logging: Experience with monitoring tools like ScienceLogic, SolarWinds, ThousandEyes, Nagios, Zabbix, or Prometheus, and basic logging solutions such as syslog and technology stacks like the ELK stack (Elasticsearch, Logstash, Kibana).
  • IT Security: Awareness of information security best practices, including data encryption, access controls, and vulnerability assessment.
  • Scripting and Automation: Experience with scripting languages like Python, PowerShell, Perl, or Bash to automate repetitive tasks and processes.
  • IT Service Management (ITSM): Basic knowledge of ITSM frameworks such as ITIL, including incident management, change management, and service request processes.
  • Web Technologies: Familiarity with web technologies such as HTML, CSS, and JavaScript, and an understanding of web servers (e.g., Apache, Nginx).
  • Version Control: Familiarity with version control systems like Git to manage and collaborate on code repositories.

Physical Requirements


  • Must be able to sit, stand, walk, stoop, kneel, and reach.
  • Must be able to speak, write, read, and understand English.
  • Must have visual acuity.
  • Must be able to lift 0-25 pounds.



Life at FiberLight

Digital transformation starts with purpose-built networks! FiberLight has been designing, building, and deploying one-of-a-kind fiber networks to ignite the digital revolution for over 20 years. Today FiberLight owns over 14,000 route miles of robust fiber networks in over 44 key growth areas in Florida, Georgia, Maryland, Texas, Virginia, and Washington, D.C. All of our carrier-grade products, including Ethernet, FiberLight Cloud Connect, Dedicated Internet Access, Dark Fiber, and Wavelengths, are engineered to ensure business continuity. We credit our success to having the right people at the right places making the right investments.
Thrive Here & What We Value

  1. Collaborative Environment
  2. Continuous Improvement and Innovation
  3. Customer-Centric Approach
  4. Professional Development Opportunities
  5. Strong Leadership and Mentorship

ThatStartupJob
Discover the best startup and their job positions, all in one place.
Copyright © 2024