Day-to-day basis - Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
(Top duties the contractor will be accountable for)
1. Code and build ETL pipelines; assist with migrating the platform to GCP
2. Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis
3. Ensure Big Data practices integrate into overall data architectures and data management principles (e.g. data governance, data security, metadata, data quality)
What will the contractor need to deliver in their initial 1–6 months? Deliver one data ingestion project independently
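For illustration only, the sketch below shows the extract-transform-load shape that such an ingestion task typically takes; the file names, schema, and field names are hypothetical and not part of this requisition.

    # Illustrative ETL sketch (hypothetical source file and fields), not a
    # prescribed implementation for this role.
    import csv
    import json

    def extract(path: str) -> list[dict]:
        """Read raw records from a CSV source file."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(records: list[dict]) -> list[dict]:
        """Standardize fields: trim whitespace and cast amounts to floats."""
        cleaned = []
        for row in records:
            cleaned.append({
                "customer_id": row["customer_id"].strip(),
                "amount": float(row["amount"]),
            })
        return cleaned

    def load(records: list[dict], path: str) -> None:
        """Write newline-delimited JSON, a format commonly used for warehouse load jobs."""
        with open(path, "w") as f:
            for row in records:
                f.write(json.dumps(row) + "\n")

    if __name__ == "__main__":
        load(transform(extract("sales_raw.csv")), "sales_clean.ndjson")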
Top 3 skill sets
1. Hadoop development
2. Knowledge of GCP and cloud-native solutions
3. Business Intelligence analysis
Knowledge of SQL and a couple of programming languages
Ideal candidate -
(Someone who has performed in this role previously, etc.)
Any testing in the interviews? Yes, a 30-minute HackerRank test followed by at least two rounds of interviews
Potential to hire full time - Yes, for deserving candidates