Solutions Architect, Data Engineering & Lakehouse
Snowflake
Where Data Does More. Join the Snowflake team.
Enterprises are modernizing their data platforms and processes at a growing rate to meet the demands of their customers. Much of this journey requires not just technical expertise but also the ability to drive predictability and manage complexity. Snowflake’s Professional Services organization offers our customers a market-leading set of technical capabilities, as well as modernization best practices grounded in experienced leadership. Our portfolio of modernization solutions spans data migrations, validation, application development, data sharing, lakehouse implementations, and data science.
As a Solutions Architect on our team, you will be responsible for delivering exceptional outcomes for our teams and customers on modernization projects, with a focus on modern data architectures, including lakehouses and streaming data. You will engage with customers to migrate from legacy data platforms, including those based on Cloudera, and to build modern data pipelines and lakehouse solutions on Snowflake. You will act as the technical lead and expert for our customers throughout this process.
In addition to customer engagements, you will work with our internal team to provide requirements for our SnowConvert utility based on project experience, ensuring that our tooling is continuously improved by what we learn from implementations. This role reports to the Director of the Specialized Workload Solutions team within the PS&T organization at Snowflake.
KEY RESPONSIBILITIES:
Delivery
Be well-versed in migrating applications, code, and data onto cloud platforms, and in leading the design of the resulting services on Snowflake, including lakehouse architectures using Iceberg.
Be able to outline the architecture of Spark, Scala, and Cloudera environments.
Guide customers on architecting and building data engineering pipelines on Snowflake, including streaming data ingestion and processing.
Design and implement data ingestion and transformation pipelines using Apache NiFi.
Architect and implement Iceberg-based lakehouse solutions on Snowflake.
Run workshops and design sessions with stakeholders and customers.
Troubleshoot migration and implementation issues.
Create repeatable processes and documentation as a result of customer engagements.
Write Python and shell scripts for ETL workflows.
Develop best practices, including ensuring knowledge transfer, so that customers are properly enabled and able to extend the capabilities of Snowflake on their own.
Provide guidance on how to resolve customer-specific technical challenges.
Outline a testing strategy and plan.
Optimize Snowflake for performance and cost.
Design and implement solutions for streaming data ingestion and processing.
Product Strategy
Communicate requirements for Snowpark conversion capabilities covering Scala-, Python-, and Spark-based back-end software modules.
Communicate requirements for the design and development of back-end big data frameworks as enhancements to our Snowpark platform.
Weigh in on and develop frameworks for distributed computing, Apache Spark, PySpark, Python, HBase, Kafka, REST-based APIs, and machine learning as part of our tooling development (SnowConvert) and our overall modernization processes.
Provide input on Snowflake's Lakehouse capabilities and integration with Iceberg.
OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE:
Bachelor's degree in a technical discipline or equivalent practical experience.
6+ years of experience in a customer-facing technical role working on complex technical implementation projects, with a proven track record of delivering results on multi-party, multi-year digital transformation engagements.
Experience in Data Warehousing, Business Intelligence, application modernization, lakehouse implementations, or cloud projects, including building real-time and batch data pipelines using Apache Spark and Apache NiFi.
Experience implementing lakehouse solutions based on Apache Iceberg or other open table formats (Delta, Hudi).
Experience with streaming data technologies and patterns.
Experience with Cloudera platform and migrations from Cloudera.
Ability to deliver outcomes for customers in a new arena and with a new product set.
Ability to learn new technology and build repeatable solutions/processes.
Ability to anticipate project roadblocks and have mitigation plans in-hand.
Proven ability to communicate and translate effectively across multiple groups from design and engineering to client executives and technical leaders.
Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations.
Ability and flexibility to travel to work with customers on-site as needed.
Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.
Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.
How do you want to make your impact?
For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com