Job Description
A client is looking for a GCP Data Engineer to join their team. The client is currently building a new data warehouse. As a Data Engineer, you will be responsible for designing, developing, and maintaining robust, scalable frameworks, services, applications, and pipelines for processing large volumes of data. You will work closely with cross-functional teams to deliver high-quality software solutions that meet organizational needs. Additional responsibilities are shown below:
• Building data pipelines for large volumes of data.
• Designing, implementing, and managing ETL job execution flows.
• Implementing and maintaining data ingestion processes.
• Writing basic to advanced optimized queries using HQL, SQL, and Spark.
• Designing, implementing, and maintaining data transformation jobs using the most efficient tools and technologies.
• Ensuring the performance, quality, and responsiveness of solutions.
• Participating in code reviews to maintain code quality.
• Using Git for version control.
• Setting up and maintaining CI/CD pipelines.
• Troubleshooting, debugging, and upgrading existing applications and ETL job chains.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Required Skills & Experience
• 6+ years of experience working as a Data Engineer
• Experience working within GCP cloud
• 4+ years of experience with Python and the PySpark APIs, specifically object-oriented programming
• Extensive experience with ETL job design principles
• Extensive experience working with PySpark
• Solid understanding of HQL, SQL, and data modeling
• Knowledge of Unix/Linux and shell scripting principles
• Ability to write shell scripts
• Familiarity with Git and version control systems
• Experience with Jenkins and CI/CD pipelines
• Knowledge of software development best practices and design patterns
• Bachelor’s degree in Computer Science, Engineering, or a related field
Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401(k) retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.