Job Description

About Ulfix

Ulfix is a growing global software factory that has provided web and mobile development services for more than 15 years. We are an agile, DevOps-driven organization, appraised at CMMI V2.0 Level 3, serving the software development market across North America (Canada, Mexico, and the US). Our strength is our distinctive approach to continuous software delivery and rapid prototyping.

We are looking for experienced collaborators who want to be part of a fast-paced, dynamic environment, where everyone’s opinions and efforts are valued. We hire outstanding professionals and give them the opportunity to thrive in a collaborative atmosphere that values learning, growth, and hard work.

Our team of 30+ professionals is distributed across our offices in Mexico City, Monterrey, and Houston; many team members who aren't located near an office work remotely. We provide a highly competitive compensation and benefits package, including health insurance and paid time off.

If you want to be part of an exciting, challenging, and rapidly growing software factory, and if you are passionate about data, we'd like to meet you.

Responsibilities

  • Experience with design patterns for data lakes and data warehouses (Snowflake), as well as with API and real-time solution design.
  • Solid experience with, and understanding of, architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must.
  • Develop ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; write SQL queries and stored procedures against Snowflake (a minimal sketch follows this list).
  • Develop scripts (Unix shell, Python, etc.) to extract, load, and transform data; working knowledge of Teradata and MS SQL Server.
  • Provide production support for data warehouse issues such as data load problems and transformation/translation problems.
  • Translate BI and reporting requirements into database designs and report designs.
  • Understand data transformation and translation requirements and which tools to leverage to get the job done.
  • Understand data pipelines and modern, cloud-based approaches to automating them.
  • Test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions.
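For illustration only, here is a minimal sketch of the Python-plus-SnowSQL ETL step described above, written against the snowflake-connector-python library. Every credential, stage, schema, and table name below (ETL_USER, RAW_STAGE, RAW_ORDERS, DW.ORDERS, and so on) is a hypothetical placeholder, not a detail of the role:

    # Minimal ETL sketch, assuming the snowflake-connector-python package
    # (pip install snowflake-connector-python). All names are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="ETL_USER",        # hypothetical service account
        password="...",         # in practice, pull from a secrets manager
        account="xy12345",      # hypothetical account identifier
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    cur = conn.cursor()
    try:
        # Extract/Load: copy raw files from a named stage into a staging table.
        cur.execute(
            "COPY INTO RAW_ORDERS FROM @RAW_STAGE/orders/ "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # Transform: upsert staged rows into the warehouse table.
        cur.execute("""
            MERGE INTO DW.ORDERS AS t
            USING STAGING.RAW_ORDERS AS s
            ON t.ORDER_ID = s.ORDER_ID
            WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS
            WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS)
                VALUES (s.ORDER_ID, s.STATUS)
        """)
        # Quick sanity check on the result.
        print(cur.execute("SELECT COUNT(*) FROM DW.ORDERS").fetchone()[0])
    finally:
        cur.close()
        conn.close()

In practice a pipeline like this would run under a scheduler, with credentials injected at runtime rather than hard-coded.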

Basic Skills Required

  • Minimum 2 years designing and implementing a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse.
  • 3-4 years of hands-on experience building productionized data ingestion and processing pipelines using Java, Spark, Scala, Python, and AWS services.
  • 2 years of hands-on experience designing and implementing production-grade data warehousing solutions on large-scale data technologies such as Teradata, MS SQL Server, or DB2.
  • Expertise in, and an excellent understanding of, Snowflake internals and the integration of Snowflake with other data processing and reporting technologies.
  • Excellent presentation and communication skills, both written and verbal.
  • Ability to problem-solve and architect in an environment with unclear requirements.

Required Skills

  • Minimum 4 years of experience architecting large-scale data solutions: performing architectural assessments, crafting architectural options and analysis, and finalizing the preferred solution alternative in collaboration with IT and business stakeholders.
  • Experience building data ingestion pipelines using data streaming technologies such as Kafka and Informatica.
  • Experience working with AWS, Azure, and Google Cloud data services.
  • Snowflake data engineers will be responsible for architecting and implementing very large-scale data intelligence solutions around the Snowflake Data Warehouse.
  • Professional knowledge of AWS Redshift and/or Teradata is required.
  • Develop ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; write SQL queries against Snowflake and Snowflake stored procedures (JavaScript).
  • Develop scripts (Unix shell, Python, etc.) to extract, load, and transform data; working knowledge of AWS Redshift.
  • Provide production support for data warehouse issues such as data load problems and transformation/translation problems, and translate BI and reporting requirements into database designs and report designs.
  • Understand data transformation and translation requirements and which tools to leverage, as well as data pipelines and modern, cloud-based approaches to automating them.
  • Test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions.

More Details
Employment Type: Full Time
Location: [REMOTE]
Experience Required: Mid-Senior Level
Date Published: 12 Oct 2021