Stafford Technology

Job title: Data Engineers – Snowflake

#Data #Engineers #Snowflake

Company: Stafford Technology

Job description: The Data Engineer is responsible for designing and constructing data flows that assemble and refine complex data sets into usable information that supports organizational initiatives. They will work with architects and team leads to design and build data-focused artifacts consistent with the architecture direction and business areas, satisfying documented functional and non-functional requirements.

The scope of their work will support and drive the capabilities of Business Intelligence, Operational Reporting, the Enterprise Data Warehouse, Enterprise Application Integrations, and partner application functions. In some cases, they may oversee and guide the development activities of fellow engineers in areas where they have subject matter expertise.

Essential Roles and Responsibilities

  • 60% of time: Create and maintain data pipelines and associated objects/services
  • 20% of time: Work with data and design teams to define solutions and support data requirements
  • 20% of time: Support peers and associated activities that enhance the overall productivity and capabilities of the department and team

Skills and Qualifications

  • BS / Graduate Degree in Computer Science, Engineering, Mathematics, Statistics or related field
  • 3+ years of experience in similar technical roles (ETL, Application Development, Data Science, Big Data, Reporting)
  • Exposure to and experience with document markup formats (JSON, XML), document data stores, and REST API endpoints for data retrieval and updates
  • Solid background in relational databases and SQL
  • Experience building and optimizing data pipelines and data sets
  • Experience with cloud technologies and their use cases; AWS preferred, including services such as S3, EC2, EMR, DynamoDB, Aurora, Athena, Glue, and Lambda
  • Ability to analyze data, find patterns, identify issues, and improve the integrity and quality of data and associated technical processes
  • Ability to build processes supporting data transformation, data structures, metadata, dependency management, and workload management
  • Working knowledge of message queuing, stream processing, and scalable data stores / processes
  • Experience with data warehouse and associated modeling / design (data mart, dimensions, facts)
  • Exposure to and familiarity with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc. (Python preferred)
  • Experience with enterprise integration and ETL platforms/iPaaS (SnapLogic, Informatica, SSIS)
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Strong organizational and interpersonal skills
  • Protect and take care of our company’s and members’ data every day by committing to work within our company’s ethics and policies
  • MarkLogic or Snowflake experience is a big plus (see the pipeline sketch after this list)
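
For illustration, here is a minimal sketch of the kind of pipeline work described above: pulling JSON records from a REST endpoint and loading them into a Snowflake table via the snowflake-connector-python library. The endpoint URL, table and column names, warehouse/database/schema, and credentials are hypothetical placeholders, not details from this posting.

```python
# Minimal illustrative pipeline: fetch JSON from a REST API and load it
# into Snowflake. All names below (URL, credentials, table) are placeholders.
import os

import requests
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical REST endpoint assumed to return a JSON array of records.
API_URL = "https://api.example.com/v1/orders"


def extract(url: str) -> list[dict]:
    """Fetch JSON records from the endpoint."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()


def transform(records: list[dict]) -> list[tuple]:
    """Keep only the fields the target table expects."""
    return [(r["id"], r["customer"], r["amount"]) for r in records]


def load(rows: list[tuple]) -> None:
    """Bulk-insert rows into a Snowflake staging table."""
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="LOAD_WH",   # placeholder warehouse
        database="ANALYTICS",  # placeholder database
        schema="STAGING",      # placeholder schema
    )
    try:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO orders_stg (id, customer, amount) VALUES (%s, %s, %s)",
                rows,
            )
    finally:
        conn.close()


if __name__ == "__main__":
    load(transform(extract(API_URL)))
```

In practice a job like this would run on a scheduler and would likely stage files and bulk-load rather than row-insert (for example via S3 plus Glue or Lambda, services named above); the sketch only illustrates the extract/transform/load shape of the work.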

Contract position

$50–70 per hour, dependent upon experience

Expected salary: $50–70 per hour

Location: Snowflake, AZ

Job date: Fri, 03 Jun 2022 01:37:07 GMT
