Position title
Data Engineer (Snowflake/dbt) – Remote Data Hub
Description

Job Summary

InfoTech Solutions is seeking a highly skilled and motivated Data Engineer (Snowflake/dbt) to join our growing Remote Data Hub team. This role is ideal for a data professional who is passionate about building scalable data platforms, transforming raw data into actionable insights, and enabling data-driven decision-making across the organization.

As a Data Engineer, you will play a critical role in designing, developing, and maintaining modern cloud-based data pipelines using Snowflake and dbt. You will collaborate closely with data analysts, data scientists, and business stakeholders to ensure data is reliable, accessible, and optimized for analytics and reporting in a fully remote, high-performance environment.

Key Responsibilities

  • Design, build, and maintain robust data pipelines and ETL/ELT workflows using Snowflake and dbt.

  • Develop scalable data models and transformations to support analytics, BI, and advanced reporting.

  • Integrate data from multiple structured and unstructured sources (APIs, databases, SaaS tools).

  • Ensure data quality, consistency, and reliability through validation, testing, and monitoring.

  • Optimize query performance and storage in Snowflake for cost and efficiency.

  • Implement best practices for data governance, security, and access control.

  • Collaborate with analytics and product teams to translate business requirements into technical solutions.

  • Automate data workflows and orchestrations using tools such as Airflow, Prefect, or similar (an illustrative sketch follows this list).

  • Document data architecture, pipelines, and processes for maintainability and scalability.

  • Troubleshoot and resolve data pipeline issues in a timely manner.
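
For illustration only, the orchestration work described above might look like the minimal Airflow DAG sketched below, which runs a dbt build and then dbt tests on a daily schedule. It assumes Airflow 2.x and the dbt CLI; the DAG id, file paths, and schedule are hypothetical placeholders rather than details of InfoTech Solutions' actual stack.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical project paths; substitute the real dbt project location.
    DBT_ARGS = "--project-dir /opt/dbt/analytics --profiles-dir /opt/dbt"

    with DAG(
        dag_id="daily_dbt_build",        # placeholder name
        start_date=datetime(2026, 1, 1),
        schedule="@daily",               # Airflow 2.4+ keyword
        catchup=False,
    ) as dag:
        # Build the dbt models in Snowflake, then run dbt's data tests.
        dbt_run = BashOperator(task_id="dbt_run", bash_command=f"dbt run {DBT_ARGS}")
        dbt_test = BashOperator(task_id="dbt_test", bash_command=f"dbt test {DBT_ARGS}")

        dbt_run >> dbt_test

Keeping the build and test steps as separate tasks makes failures easy to localize and rerun, which is the kind of maintainability the responsibilities above call for.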

Required Skills and Qualifications

  • Strong experience with Snowflake as a cloud data warehouse.

  • Hands-on expertise with dbt (data build tool) for data transformation and modeling.

  • Proficiency in SQL for complex queries and performance tuning.

  • Experience with Python or similar programming languages for data engineering tasks (see the brief sketch after this list).

  • Knowledge of modern data stack tools (Fivetran, Stitch, Segment, Kafka, etc.).

  • Experience working with cloud platforms such as AWS, Azure, or GCP.

  • Familiarity with version control systems (Git) and CI/CD pipelines.

  • Understanding of data warehousing concepts, star/snowflake schemas, and dimensional modeling.

  • Strong problem-solving skills and attention to detail.
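
As a rough sketch of the day-to-day SQL-plus-Python work implied by the list above, the snippet below queries a Snowflake table using the snowflake-connector-python package. The connection settings, warehouse, and table name (fct_orders) are hypothetical placeholders, not details of the actual environment.

    import os

    import snowflake.connector  # pip install snowflake-connector-python

    # Credentials come from environment variables; all object names are placeholders.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="MARTS",
    )

    try:
        cur = conn.cursor()
        # A typical reporting aggregate: daily order counts from a fact table.
        cur.execute(
            """
            SELECT order_date, COUNT(*) AS order_count
            FROM fct_orders
            GROUP BY order_date
            ORDER BY order_date DESC
            LIMIT 7
            """
        )
        for order_date, order_count in cur.fetchall():
            print(order_date, order_count)
    finally:
        conn.close()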

Experience

  • 3–6 years of experience in Data Engineering, Analytics Engineering, or related roles.

  • Proven experience building production-grade data pipelines in a cloud environment.

  • Prior experience working in remote or distributed teams is a plus.

  • Experience supporting BI tools such as Power BI, Tableau, Looker, or similar is desirable.

Working Hours

  • This is a fully remote role.

  • Flexible working hours aligned with global teams.

  • Core overlap with global business hours is expected (at least 4–5 hours per day).

Knowledge, Skills, and Abilities

  • Strong analytical and logical thinking abilities.

  • Excellent communication skills to collaborate with technical and non-technical stakeholders.

  • Ability to work independently and manage priorities in a remote environment.

  • High attention to data accuracy and system reliability.

  • Adaptability to new tools, technologies, and evolving business requirements.

  • Strong documentation and knowledge-sharing mindset.

Benefits

  • Competitive salary package with performance-based incentives.

  • 100% remote work with a flexible schedule.

  • Health insurance and wellness programs.

  • Paid time off, holidays, and sick leave.

  • Learning and development budget for certifications and courses.

  • Exposure to cutting-edge data technologies and global projects.

  • Career growth opportunities within a fast-growing organization.

Why Join InfoTech Solutions?

At InfoTech Solutions, we believe data is at the heart of innovation. You will be part of a forward-thinking organization that values technology, collaboration, and continuous learning. Our Remote Data Hub provides an inclusive and flexible work culture where your ideas are valued, your growth is supported, and your impact is visible.

Joining us means working on meaningful data platforms that power real business decisions while enjoying the freedom and balance of remote work.

How to Apply

Interested candidates are invited to submit their updated resume along with a brief cover letter highlighting their experience with Snowflake and dbt. Shortlisted candidates will be contacted for technical interviews and further evaluation.

Employment Type
Full-time
Job Location
Bengaluru, Karnataka, IN
Remote work from: IN
Base Salary
$10–$20 per hour
Date posted
2026-02-15
Valid through
2026-03-17