Data Architect required ASAP!
We are looking for a Data Architect who understands and appreciates code, and who can transform noisy real-world data into high-signal models that stand the test of time.
The role will be FULLY REMOTE.
Role Requirements:
- Having thoughtful discussions with the Product Manager to understand customers' data engineering requirements.
- Breaking complex requirements into smaller tasks for execution.
- Working efficiently with a solid sense of priorities; guiding your own learning and contributing to domain knowledge building.
- Mentoring and guiding team members on ETL/ELT processes in the cloud using tools like Airflow, Glue, Stitch, Cloud Data Fusion, and Dataflow.
- Working at an abstract level and building consensus; seeing from, and selling to, multiple viewpoints.
- Designing and implementing Data Lakes, Data Warehouses, and Data Marts in AWS.
- Creating efficient SQL queries and understanding query execution plans for tuning queries on engines like PostgreSQL.
- Performance tuning of OLAP/OLTP databases by creating indexes, tables, and views.
- Writing Python and Scala scripts to orchestrate data pipelines.
Technical Experience and Skill Set:
- 5+ years’ experience in data engineering
- Strong Python and Scala programming ability, with several hands-on projects implemented.
- Interest in leading a team of Data Engineers in the future, building on the successful implementation of Data Pipelines on public cloud infrastructure.
- Strong understanding of Data Engineering concepts including ETL, ELT, Data Lake, Data Warehousing, and Data Pipelines.
- Experience designing and implementing Data Pipelines, Data Lakes, Data Warehouses, and Data Marts that support terabyte-scale data.
- A clear understanding of Database concepts like indexing, query performance optimization, views, and various types of schemas. Hands-on SQL programming experience with knowledge of windowing functions, subqueries, and various types of joins.
- Proficiency with Data Modeling (e.g., Relational, Dimensional, Columnar, Big Data).
- Proficiency with complex SQL, plus NoSQL experience.
Please get in touch for further details on this excellent contract opportunity.