Position Title: Data Engineer
About the Role:
We're seeking a dynamic and resourceful Data Engineer to join our growing team. In this role, you'll be responsible for building and maintaining robust, scalable data pipelines and infrastructure that support analytics and reporting across the organization. You'll partner with both technical and business teams to ensure data flows efficiently, systems are dependable, and our data strategy is future-ready. The right candidate thrives in a fast-paced environment, is a collaborative problem-solver, and brings a sharp, adaptable mindset to technical challenges.
What You'll Do:
* Pipeline Engineering: Develop and maintain scalable ETL/ELT workflows to support data needs across analytics, operations, and business functions.
* Infrastructure Design: Architect and fine-tune data platforms across cloud and hybrid environments for optimized access and performance.
* Cross-Functional Collaboration: Partner with data scientists, analysts, engineers, and product teams to deliver reliable, actionable data solutions.
* System Performance: Ensure systems are built with scale in mind, proactively managing performance and growth.
* End-to-End Ownership: Manage the full data lifecycle, from ingestion through transformation and delivery, without handoffs or silos.
* Data Quality & Compliance: Uphold data integrity, security, and traceability in line with internal and regulatory standards.
* Innovative Problem Solving: Navigate evolving challenges with a solution-oriented mindset and a flexible approach to tools and technologies.
* Team Engagement: Contribute to a small, high-impact team where every voice counts and initiative drives progress.
What We're Looking For:
* Education: A bachelor's degree in Computer Science, Engineering, or a related discipline (advanced degrees welcomed).
* Professional Experience: Solid background in data engineering across data ingestion, transformation, and storage layers.
* Technical Proficiency: Strong command of SQL and experience with at least one core technology (e.g., Python, Scala, Spark, Snowflake, or AWS).
* Adaptability: Comfortable working in changing technical landscapes and open to evolving tools and priorities.
* Data Architecture Knowledge: Deep understanding of database systems, data modeling practices, and cloud-based infrastructure.
* Workflow Tools: Experience with orchestration tools such as Airflow, dbt, or comparable systems.
* Nice to Have: Familiarity with business intelligence platforms such as Power BI, Tableau, or MicroStrategy.
Key Attributes:
* Proactive and solution-focused, not afraid to take ownership.
* Strong analytical and systems thinking skills with a practical mindset.
* Clear communicator who can articulate technical ideas to varied audiences.
* Team player who can also thrive working independently.
* Detail-oriented, reliable, and invested in the team's and company's success.
* Agile and professional under pressure, with a flexible approach to shifting priorities.
