Sr. Data Engineer
Our Client’s Data Engineering team is the engine that powers their data-obsessed eCommerce enterprise. They move fast, iterating quickly on big business problems. They work smart, applying technology to unlock insights and provide outsized value to their customers. They swing big, knowing their customers won’t benefit from micro-optimizations. Leveraging the largest data set for products sold in their space, this team treats data as an asset and determines how to maximize its business value and extend their competitive advantage.
You will be instrumental in designing and delivering Data Warehouses, Data Lakes, Self-Service Tooling, Real-Time Streaming, and Big Data Solutions for multiple functional areas using modern cloud technologies. You will have the chance to combine deep business knowledge with technical mastery to own and deliver the right solution for the right business problem. Most importantly, you will have the opportunity to move fast, adapt quickly, and leave a lasting mark through the new solutions you deliver!
What You’ll Do
- Own the technical architecture, design, and implementation of big data platforms and self-service solutions.
- Collaborate with stakeholders and other engineering leaders to define and develop the data architecture roadmap.
- Act as a subject matter expert to leadership for technical guidance around solution design and best practices.
- Be a technical mentor to junior engineers and expose the team to new opportunities, while still being able to dependably deliver on ongoing goals.
- Keep current on big data and data visualization technology trends; evaluate technologies, build proofs of concept, and make merit-based recommendations.
Who You Are
- An expert in SQL-based ETL development, with experience in Python, Java, or an equivalent programming language.
- Experienced developing at scale on cloud platforms such as Google Cloud Platform (preferred), AWS, or Azure, or on data platforms such as Snowflake.
- Comfortable designing and implementing DW architecture, OLAP technologies, and star/snowflake schemas to enable self-service tooling.
- Experience with real-time data streaming tools such as Kafka, Kinesis, or Apache Storm.
- Experience with open-source big data technologies such as Hadoop, Spark, Cassandra, or MongoDB.
- Experience architecting data solutions utilizing BI tools such as Looker, Tableau, AtScale, or Power BI.
- Excellent communication and presentation skills, strong business acumen, critical thinking, and the ability to work cross-functionally with engineering and business partners. Previous e-commerce experience is a plus.
- 3+ years of Data Architecture experience working with large data sets (5+ TB highly desired).
- 2+ years of experience leading or mentoring technical teams.
- Bachelor’s or Master’s degree in Computer Science, Computer Engineering, Statistics, or another quantitative discipline.