Data Engineer
Essentials
Job title: Data Engineer
Location: Bucharest
Type: CIM
Technologies
ETL, Python, SQL, Azure Data Factory
Offer
A competitive income
A flexible hybrid working week, because you are best placed to judge how to deliver added value to the customer efficiently
An open & transparent company culture, with room to grow and make an impact
A group of cool colleagues who challenge each other every day with only one objective: to make each other successful!
Client
Our Client works on large-scale digital transformation programs in Marketing, Sales, Logistics, and Customer Experience, from roadmap to execution. They have clients across all industries, but they lead the market in the automotive and service industries.
They are a team that came together from all over the world: creative, curious, and always challenging the status quo. They focus on using technology to make a direct business impact, aiming to improve their clients’ bottom line and long-term sustainability.
They are setting up a Center of Excellence in Marketing Automation and Business Intelligence with a brand-new team in Romania! This is a great opportunity to be part of the founding team and to start this new adventure.
Responsibilities
- SQL Database Management: Design, implement, and maintain SQL databases to ensure efficient data storage and retrieval. Optimize and tune SQL queries for maximum performance.
- API, Elasticsearch and Data Loading: Develop and manage data ingestion processes using APIs, Elasticsearch and other data loading techniques.
- Azure Data Factory Pipelines: Develop and manage data pipelines using Azure Data Factory.
- ETL Development and Maintenance: Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Data Quality Checks: Implement checks to ensure the integrity of data throughout the ETL process.
- Ensure the reliability, scalability, and efficiency of data movement within the Azure cloud environment.
- Snowflake Migration: Work on migrating our data sources into Snowflake.
- Proactive Problem Solving: Identify and address data-related issues before they escalate, ensuring data accuracy and consistency.
- Collaboration: Work with data scientists, analysts, and other teams to understand their data requirements and deliver effective solutions.
- Communication: Clearly explain complex technical concepts to non-technical stakeholders.
- Documentation: Maintain thorough documentation for all data engineering processes, ensuring knowledge transfer and best practices.
- Data Security and Compliance: Ensure all data engineering processes adhere to data security and compliance standards. Implement data encryption and access controls as needed.
- Version Control and CI/CD: Use version control systems for code management. Implement CI/CD pipelines for data engineering workflows.
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or related field.
- Proven experience in SQL database design and optimization.
- Hands-on experience with Snowflake.
- Proficiency in creating and managing data pipelines using Azure Data Factory.
- Strong ETL development skills.
- Experience with data modelling.
- Excellent problem-solving and analytical skills.
- Proactive mindset with the ability to work independently and collaboratively.
- Strong communication and interpersonal skills.
Preferred Skills
- Familiarity with other cloud platforms (AWS, GCP).
- Experience with big data technologies.
- Knowledge of data warehousing concepts.
- Certifications in relevant technologies.
Apply today
If you meet the minimum requirements and are interested in applying for this position, please send your details to careers@key-talents.com with “Data Engineer” in the subject line.