We are seeking a highly motivated Data Engineer for an onsite, W2 assignment with our client in Austin, Texas.
Responsibilities include:
• Design, develop, and optimize robust ETL/ELT pipelines in SAS and Snowflake to meet business intelligence, analytics, and reporting requirements.
• Leverage Star and Snowflake schema modeling to efficiently structure scalable data warehouses.
• Manage, maintain, and enhance Data Lakes and Data Marts, ensuring streamlined data storage, reliable transformation, and rapid retrieval.
• Implement and champion best practices for data governance, security compliance, and performance tuning within Snowflake and SAS environments.
• Collaborate proactively with data analysts, business stakeholders, and technical teams to clearly define data requirements and deliver scalable, forward-thinking solutions.
• Monitor, troubleshoot, and proactively resolve data pipeline issues, ensuring the high availability, quality, and accuracy of critical business data.
• Develop automated solutions and optimization strategies to enhance the efficiency, reliability, and scalability of data processing workflows.
• Partner closely with cross-functional teams to drive successful data migration efforts, cloud adoption, and strategic data modernization initiatives.
Requirements:
• At least 3 years of experience as a Data Engineer.
• Hands-on experience with SAS tools strongly preferred.
• Demonstrated ability to work with diverse business stakeholders to gather requirements accurately and translate complex data concepts into intuitive visualizations that support informed decision-making.
• Proven ability to navigate ambiguity, proactively address gaps in data availability, and clearly communicate solutions in rapidly evolving environments.
• Exceptional analytical, problem-solving, and interpersonal skills, with experience managing relationships with clients and user groups under tight deadlines in fast-paced environments.
• Strong project and program management experience, including proficiency with Agile methodologies, change management principles, and successful project delivery.
• Hands-on experience with AWS Glue or a similar data integration platform (Azure Data Factory, Apache Airflow, Informatica, Talend) for building, executing, scheduling, and optimizing automated data jobs.
• Demonstrated expertise in creating clear, comprehensive, and accessible technical documentation and process standards that enhance organizational knowledge retention and collaborative efficiency.
• Experience with SQL, Python, or other programming languages for data engineering workflows is strongly preferred.