
Data Engineer: ZTSJP00003446

Ref: US_EN_6_971649_1340249

Posted 6 days ago
Job Location: Parsippany, New Jersey
Contract Type: Contractor
Category: Engineering

Job Title - Data Engineer

Parsippany, New Jersey

Monday-Friday, 9 AM - 6 PM EST preferred, but flexible

Job Description

As a Data Engineer within the Data and Analytics COE, you will implement ETL workflows that publish analytical data to an internal audience through a data repository (DDW) built on well-established dimensional modeling principles. The COE team is focused on onboarding a number of key business processes that internal stakeholders rely on to drive more data-informed decisions. You will be responsible for implementing ETL for a defined subset of these business processes, operating in an agile environment.

Responsibilities:

• Collaborate with Product Owners on designs for new business processes and enhancements of existing processes

• Accept requirements in the form of a target schema (DDL) that ETL processes must populate, along with other dimensional modeling artifacts (bus matrix, business process definitions, etc.)

• Collaborate with source system experts to identify and locate the required data in source systems and understand the nuances surrounding the source data

• Profile the source system data to develop a deeper understanding and identify edge cases and potential quality issues that need further discussion with source system experts

• Develop a high-level plan for ETL development and provide effort estimates

• Define a plan (metadata) for each target column that specifies either how the column will be sourced directly from transactional systems or the logic by which it will be derived

• Develop batch ETL workflows, using a combination of Alteryx and SQL, that target a specified dimensional model schema (fact and dimension tables); a brief illustrative SQL sketch follows this list

• Test and validate ETL workflows against real-world data (scope, volume, etc.)

• Monitor the execution performance of batch workflows to identify scalability issues and opportunities for improving performance
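For illustration only, the following is a minimal sketch of the kind of star-schema target and batch load logic described above. All table and column names (stg_sales, dim_customer, fact_sales, etc.) are hypothetical and are not drawn from the actual DDW.

-- Hypothetical target schema (names for illustration only)
CREATE TABLE dim_customer (
    customer_key  INT IDENTITY(1,1) PRIMARY KEY,
    customer_id   VARCHAR(20)  NOT NULL,   -- natural key from the source system
    customer_name VARCHAR(200) NOT NULL,
    sales_region  VARCHAR(50)  NOT NULL
);

CREATE TABLE fact_sales (
    customer_key INT           NOT NULL REFERENCES dim_customer (customer_key),
    date_key     INT           NOT NULL,   -- surrogate key into a date dimension
    order_number VARCHAR(30)   NOT NULL,   -- degenerate dimension
    quantity     INT           NOT NULL,
    net_amount   DECIMAL(18,2) NOT NULL
);

-- Simplified batch load: resolve surrogate keys while inserting staged source rows
INSERT INTO fact_sales (customer_key, date_key, order_number, quantity, net_amount)
SELECT d.customer_key,
       CAST(CONVERT(VARCHAR(8), s.order_date, 112) AS INT),   -- yyyymmdd date key
       s.order_number,
       s.quantity,
       s.net_amount
FROM   stg_sales AS s
JOIN   dim_customer AS d
  ON   d.customer_id = s.customer_id;

In practice, the target DDL and other modeling artifacts are supplied as requirements, and the transformation logic is split between Alteryx workflows and SQL as appropriate.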

Knowledge, Skill and Ability Requirements:

• Experience developing ETL workflows that target dimensional models or star schemas required

• Proficiency implementing at least moderately complex data transformation workflows in Alteryx required

• Proficiency with the SQL language, including DDL, required

• Working knowledge of key dimensional modeling concepts (Kimball methodology) required

• Experience with sourcing data from SAP and/or Salesforce a plus

• Familiarity with business processes involved in B2B transactions (direct and indirect sales, promotional activities, accruals for promotional expenses, etc.) a plus

• Experience with SQL Server Analysis Services (SSAS) a plus

• Experience with SSAS multidimensional mode and MDX expression language a plus

• Experience with Tableau or Power BI a plus

• Experience with Microsoft Azure a plus

• Excellent critical thinking and reasoning skills, particularly as applied to tracing data issues

• Passion for solving complex problems and making a difference

• Ability to drive a project and work both independently and in a team

• Ability to effectively work from home when required

• Excellent verbal and written communication skills

Qualifications:

• Bachelor of Science in Computer Science, Computer Engineering, Electrical Engineering, or a related field

• Minimum 4 years of experience in a closely related field and role

Equal Opportunity Employer/Veterans/Disabled

To learn how we will use your information, please read our Candidate Privacy Information Statement.

The Company will consider qualified applicants with arrest and conviction records

