Data Warehouse Developer/Data Engineer – using a modern cloud toolset

Deadline 30.11.2022

We are hiring a Data Warehouse Developer/Data Engineer. For the first 6–12 months, you will help an international financial institution move from an on-premises setup to the cloud, using modern tools such as Snowflake and Databricks. 

In the long run, you will create ETL processes and data models and set up data warehouses on different platforms. If you wish, there are opportunities to try other tasks/activities, such as gathering customer requirements and building visual reporting dashboards.  

We are not stuck on specific tools and will use whatever fits the purpose of each project. This means you will be able to gain hands-on experience from a broad spectrum of technologies and tools used in this business. 

Apply

Must-haves

  • At least 2 years of experience working as an SQL developer, Data Warehouse developer, or Data Engineer 
  • Very good SQL knowledge  
  • Experience with dimensional data modelling
  • Very good English language skills, both written and spoken 
  • Analytical mindset 
  • Ability and willingness to communicate with customers to clarify requirements and build solutions 

Nice-to-haves

  • Higher education in IT/Statistics/Data Science or similar 
  • Experience with Python or R 
  • Experience working with financial data 
  • Snowflake experience 

You will be doing

In the short run

  • Writing SQL on an existing Microsoft SQL platform for an international financial organisation 
  • Helping them move to a new cloud platform using tools such as Snowflake, Databricks and Azure Data Factory 
  • Continued data warehouse development using the cloud tools 
  • Participating in a modern agile development process with other Helmes team members 

In the long run

  • Setting up ETL/ELT processes using script-based solutions such as Databricks, as well as GUI-based tools 
  • Setting up data warehouses, both on the cloud and on-premises 
  • Creating data models 
  • Conducting meetings with customers to gather requirements and propose solutions (if you’re interested) 
  • Creating visualisations on platforms such as Power BI (if you’re interested) 
  • We work with customers from a variety of domains and aim to keep the projects varied 
  • You can specialise more or less depending on your interests and the projects at hand 

Benefits

What is in it for you? 

  • Opportunity to work with interesting data sets, various clients and challenging international projects 
  • Insight into a range of tools on the Google, AWS and Azure clouds, as well as on-premises solutions 
  • Opportunity to use scripting languages such as Python, as well as GUI-based ETL tools 
  • A large degree of autonomy and a chance to take responsibility 
  • An experienced mentor who will guide you and provide you with feedback for fast development 
  • A minimum of bureaucracy, as part of a small, friendly, agile team 
  • No overtime work is expected! 
  • Possibility to work from home for a large portion of your time; coming to the office is fine too

Helmes is an international software and data services company headquartered in Estonia with clients across all of Europe.

The long-term success of Helmes is built on lasting partnerships that bring about tangible business gains for our clients. 


Please note that recruitment for this position is an ongoing process, and the posting may close early.

Apply Here