Eventbrite has the world’s largest data repository of live events, powering millions of events and hundreds of millions of ticket transactions each year in 170+ countries. Our Data Engineering teams help wrangle data to drive business growth for our event creators, attendees, and the Eventbrite leadership. As we position Eventbrite for success over the next few years, we are forming critical partnerships across the business to build high-quality data models and infrastructure that drive actionable insights into our product features and services.
THE TEAM
Our Data Foundry team builds and operates a reliable, scalable, and secure data platform to meet Eventbrite’s needs. The team partners with product engineering teams to deliver data-powered functionality for customers, works with cross-functional stakeholders to provide visibility into business-critical metrics, and collaborates with analysts and data scientists to enhance modeling and analytics capabilities.
YOU WILL
Work with the team to design and implement features for the data platform
Automate and optimize data pipelines and set up monitoring and feedback systems
Write and optimize ETL processes between the product and homegrown or third-party tools
Drive vendor proofs of concept (POCs) for data and analytics solutions
Translate data consumption requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses
Collaborate cross-functionally to revamp the data models that power business-critical internal and customer-facing metrics
Advocate for data-driven thinking and help elevate data literacy across the company
Mentor and learn from peers to build a cohesive culture and best practices for data at Eventbrite
THE SKILL SET
Outstanding interpersonal and communication skills
Strong analytical, problem-solving, and critical-thinking skills
Proficiency in Python and SQL
Experience running ETL processes on large-scale data sets
Familiarity with Data Engineering tools such as Spark, Hive, Hadoop, MySQL, Airflow, DynamoDB, EMR, Terraform, Kafka/Kinesis, Presto/Trino, and Kubernetes
Experience translating high level business questions to analytics needs and implementation
Data modeling experience and the ability to partner with technical and non-technical stakeholders
5+ years of experience in Software or Data Engineering, building high quality production software
Familiarity with Data Science, Machine Learning, Data Analytics, and the technologies that support them
BONUS POINTS
Experience with AWS products and services
Experience architecting and/or running data infrastructure
Active Eventbrite user with a passion for live events
Skilled in various forms of data modeling on structured, semi-structured and unstructured data
Familiarity with server-side frameworks such as Django, Express, Rails, or .NET
Familiarity with database optimization and scaling approaches, including indexing, partitioning, sharding, clustering, in-memory tables, and horizontal and vertical scaling
Familiarity with managing large datasets and with the complexities of merging large databases, meeting security audit requirements, and implementing data retention policies