Job Description
• Build and maintain data pipelines that collect, store, and transform data to support analytics use cases and business outcomes.
• Implement data ingestion and transformation workflows in Microsoft Fabric, using Fabric-native capabilities such as notebooks, pipelines, and lakehouse patterns.
• Develop and operationalize data solutions across lakehouse layers (e.g., landing and standardized “Bronze” data through curated “Silver/Gold” outputs) aligned to the platform’s workspace architecture and OneLake design.
• Ensure data solutions are reliable and supportable by incorporating monitoring, issue resolution, and ongoing enhancements to pipelines and datasets.
• Collaborate across teams (engineering, analytics, product, and stakeholders) to translate data needs into scalable, reusable solutions and improved workflow efficiency.
• Support secure and appropriate use of Fabric assets by following established access and workspace practices.
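To illustrate the Bronze-to-Silver layering described above, here is a minimal Python sketch of a landing-to-curated cleaning step. In Microsoft Fabric this logic would typically run in a notebook against OneLake tables; the record layout, field names, and validation rules here are hypothetical and shown only to convey the pattern.

```python
# Minimal sketch of a Bronze -> Silver cleaning step in a medallion
# lakehouse. Field names (customer_id, email, signup_date) and the
# validation rules are illustrative assumptions, not part of the role.

from datetime import date


def bronze_to_silver(bronze_rows):
    """Standardize raw landing-zone ("Bronze") rows into curated
    "Silver" records: trim strings, normalize casing, parse dates,
    and drop rows that fail basic validation (missing key)."""
    silver = []
    for row in bronze_rows:
        if not row.get("customer_id", "").strip():
            continue  # reject records without a usable key
        silver.append({
            "customer_id": row["customer_id"].strip(),
            "email": row.get("email", "").strip().lower(),
            "signup_date": date.fromisoformat(row["signup_date"]),
        })
    return silver


raw = [
    {"customer_id": " C001 ", "email": "Ada@Example.COM", "signup_date": "2024-03-01"},
    {"customer_id": "", "email": "no-key@row", "signup_date": "2024-03-02"},
]
clean = bronze_to_silver(raw)  # only the valid record survives
```

Downstream “Gold” outputs would then aggregate these curated records for reporting, typically in a separate workspace per the platform’s architecture.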
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Required Skills & Experience
- 5-7 years of data engineering experience
- Strong ETL and data pipelining experience with Python and SQL
- Strong Microsoft Fabric experience for data transformation and integration
- Azure cloud experience
Nice to Have Skills & Experience
- Data Vault 2.0 and erwin data modeling experience
- Data domain experience
- Azure SQL experience
Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.