Senior Data Engineer

Zylo
Indianapolis
6-8 years
IT Infrastructure
Full-time

Job Description

Our Senior Data Engineer will be responsible for designing, implementing, and maintaining scalable data pipelines that efficiently collect, process, and store data for analysis. You will work closely with data scientists, software engineers, analysts, and other stakeholders to ensure that the data infrastructure supports business intelligence, machine learning, and other data-driven solutions.

  What will you do
  • Data Pipeline Development: Design, build, and maintain robust and scalable data pipelines to extract, transform, and load (ETL) data from various sources to data warehouses or data lakes.
  • Data Integration: Integrate data from multiple internal and external sources into centralized storage systems while ensuring data quality and consistency.
  • Database Management: Manage large datasets and databases, ensuring their security, performance, and scalability.
  • Data Modeling: Create and optimize data models to support analytics, reporting, and machine learning workloads.
  • Optimization and Performance: Continuously monitor and improve the performance, reliability, and efficiency of the data pipeline infrastructure.
  • Data Quality: Implement measures to maintain data integrity, cleanliness, and consistency across all systems.
  • Automation: Automate repetitive data tasks to streamline workflows and reduce manual intervention.
  • Documentation: Document data processes, pipeline configurations, and data flow designs for team collaboration and future reference.
  What you will need:  
  • For this position, candidates must live within the Greater Indianapolis area. Zylo has a hybrid work environment with an in-office schedule of two days per week.
  • 5+ years of experience as a Data Engineer, Software Engineer with a data focus, or a similar role with a proven track record of designing and building data pipelines at scale.
  • Strong experience in ETL processes, data modeling, and managing large datasets in cloud environments.
  • Proficiency with AWS services (e.g., EC2, S3, Athena, Glue, SageMaker, Redshift) and a solid understanding of cloud data architecture.
  • Expertise in Python and SQL, with a focus on developing scalable, maintainable code to support data transformation and processing tasks.
  • Experience with orchestration tools (e.g., Fivetran, Apache Airflow) to automate and schedule ETL/ELT workflows.
  • Solid understanding of data warehouses (Redshift, Snowflake, BigQuery) and data lakes (e.g., AWS S3) for large-scale data storage and retrieval.
  • Experience with streaming data tools like Kafka and Apache Flink to handle real-time data flows.
  • Familiarity with data governance principles, including data retention, role-based access control (RBAC), and security best practices.
  • Soft Skills:
    • Strong problem-solving skills with an eye for detail and data-driven decision-making.
    • Excellent communication and collaboration skills, with the ability to work seamlessly across teams (Data Scientists, Engineers, Analysts).
    • Self-motivated with the ability to prioritize and manage multiple tasks in a fast-paced, ever-changing environment.
    • Comfortable with both independent work and contributing as part of a cross-functional team.
  Nice to have  
  • Exposure to SaaS Management or Software Asset Management environments
  • Data visualization experience (e.g., Tableau, Power BI, Looker) to communicate complex data insights
  • Familiarity with Azure or other cloud platforms
What it’s like to work with us

At Zylo, we’re committed to Growing Stronger Together by fostering a diverse and inclusive workplace. We believe that a variety of perspectives not only fuels innovation, but also allows us to better serve our diverse customer base.

Company Information

Zylo is the enterprise leader in SaaS Management, enabling companies to discover, manage, and optimize their SaaS applications. Zylo helps companies reduce costs and minimize risk by centralizing SaaS inventory, license, and renewal management. Trusted by industry leaders, Zylo’s AI-powered platform provides unmatched visibility into SaaS usage and spend. Powered by the industry’s most intelligent discovery engine, Zylo continuously uncovers hidden SaaS applications, giving companies greater control over their SaaS portfolio. With more than 30 million SaaS licenses and $34 billion in SaaS spend under management, Zylo delivers the deepest insights, backed by more data than any other provider.
