

Data Engineer (Snowflake)

Department: Big Data


Warsaw · Cracow · Wroclaw · Bialystok · Remote

Addepto is a leading consulting and technology company specializing in AI and Big Data, helping clients deliver innovative data projects. We partner with top-tier global enterprises and pioneering startups, including Rolls Royce, Continental, Porsche, ABB, and WGU. Our exclusive focus on AI and Big Data has earned us recognition by Forbes as one of the top 10 AI consulting companies.

As a Data Engineer, you will have the exciting opportunity to work with a team of technology experts on challenging projects across various industries, leveraging cutting-edge technologies. Here are some of the projects we are seeking talented individuals to join:

  • Development of an operational data warehouse for a major automotive client, supporting near real-time processing and laying the foundations for integrating AI into their business processes.
  • Building and developing a modern data warehouse for a US retail client, enabling data-driven decisions and laying the groundwork for future AI features. This role requires a consultant mindset to guide the client through the product’s roadmap.
  • Development and maintenance of a large platform for processing automotive data. A significant amount of data is processed in both streaming and batch modes. The technology stack includes Spark, Cloudera, Airflow, Iceberg, Python, and AWS.

Discover our perks and benefits:


  • Work in a supportive team of passionate enthusiasts of AI & Big Data.
  • Engage with top-tier global enterprises and cutting-edge startups on international projects.
  • Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces. 
  • Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
  • Choose from various employment options: B2B, employment contracts, or contracts of mandate.
  • Make use of 20 fully paid days off available for B2B contractors and individuals under contracts of mandate.
  • Participate in team-building events and utilize the integration budget.
  • Celebrate work anniversaries, birthdays, and milestones.
  • Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
  • Get full work equipment for optimal productivity, including a laptop and other necessary devices.
  • With our backing, you can boost your personal brand by speaking at conferences, writing for our blog, or participating in meetups.
  • Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.

In this position, you will:


  • Design, build, and maintain enterprise-grade data warehouses using Snowflake.
  • Develop scalable and secure data pipelines on AWS (e.g., S3, Glue, Lambda, IAM).
  • Collaborate with business and analytics teams to translate requirements into technical solutions.
  • Build data models and dashboards in Tableau to support business decision-making.
  • Implement and maintain ETL/ELT processes for structured and semi-structured data.
  • Optimize performance, cost, and scalability of cloud-based data platforms.
  • Ensure data governance, security, and quality best practices are followed.
  • Participate in architecture decisions and recommend improvements to existing systems.
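As a purely illustrative sketch of the day-to-day work described above (not taken from any specific Addepto project), one common step in an ELT pipeline for semi-structured data is flattening nested JSON records into tabular rows before loading them into a Snowflake table. The field names below are hypothetical:

```python
import json

def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten nested JSON (e.g., data staged in a Snowflake VARIANT column)
    into a single-level dict suitable for a relational table."""
    flat = {}
    for key, value in record.items():
        column = key if not prefix else f"{prefix}_{key}"
        if isinstance(value, dict):
            # Recurse into nested objects, joining keys with underscores.
            flat.update(flatten(value, column))
        else:
            flat[column] = value
    return flat

raw = '{"id": 1, "vehicle": {"make": "Audi", "model": "A4"}, "speed_kmh": 87}'
row = flatten(json.loads(raw))
print(row)
# {'id': 1, 'vehicle_make': 'Audi', 'vehicle_model': 'A4', 'speed_kmh': 87}
```

In practice, Snowflake can also do this transformation in SQL (via `LATERAL FLATTEN`); where the flattening happens is exactly the kind of architecture decision this role participates in.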

What you’ll need to succeed in this role:


  • 2+ years of experience in a Data Engineer or similar role.
  • Proficiency in Python for data processing and scripting.
  • Strong experience with Snowflake and data warehouse architecture.
  • Hands-on experience with Amazon Web Services (S3, Glue, IAM, Lambda; Redshift optional).
  • Solid knowledge of SQL and data modeling for analytics.
  • Proven experience building data pipelines and ETL/ELT workflows.
  • Experience working with large-scale, enterprise-level datasets.
  • Good understanding of data governance and cloud security principles.
  • Strong problem-solving skills and ability to work in cross-functional teams.
  • Excellent written and verbal communication skills in English.

Nice to have:


  • Experience in data visualization and reporting (preferably using Tableau).
  • Experience with CI/CD and Infrastructure as Code (e.g., Terraform, CloudFormation).
  • Knowledge of data catalog tools or metadata management solutions.
  • Familiarity with Agile methodologies and working with remote teams.



What impressed me when I joined Addepto is the positive atmosphere, open culture, and professionalism that’s everywhere around here. It is a place where everyone can freely exchange insights, test out ideas, and discuss them with experts from different backgrounds. On top of that, you can feel the dedication to constant learning, development, and keeping an eye on the latest technology and trends in the data realm. In my view, it’s this mix of qualities that makes Addepto a great place for anyone wanting to deepen their knowledge and grow, both personally and professionally.


Jakub Bzdęga, Senior Data Engineer – Addepto



More than work

  • Remote work possibility
  • Challenging projects
  • Autonomy at work
  • Real impact on the company
  • Knowledge sharing & trainings
  • Team-building events
  • High standard equipment
  • Medical package
  • Multisport
  • MyBenefit Cafeteria
  • Language classes
  • Referral program
  • EyeCare
  • Paid time off


We are recognized as one of the best AI, BI, and Big Data consultants


We have helped multiple companies achieve their goals, but instead of making hollow marketing claims here, we encourage you to check our Clutch scoring.