About Us:
WITHIN is the world's first Performance Branding company, partnering with some of the biggest brands in the world to drive business growth through innovative marketing strategies. Our integrated operating model collapses the traditional marketing silos between creative and media, performance and brand, and across media channels. With a full suite of offerings including media, creative, SEO, Lifecycle, Retail Media, Affiliate and Influencer, we’re able to work with our brand partners in an integrated fashion, allowing us to align marketing strategies back to core business objectives. Client teams at WITHIN are trained to act as trusted business partners, serving as fiduciaries who put client needs above our own.
Teams at WITHIN work with iconic brands such as The North Face, Timberland, Movado Watches, and Jose Cuervo. Everyone at WITHIN wants to grow and be challenged. It’s a collaborative place made up of small, closely knit, versatile teams that move fast and adapt quickly to solve problems and build systems.
Check out some of our work!
About the Role: We are seeking a motivated Data Scientist with 2-4 years of experience to join our dynamic and growing team. The ideal candidate will have a strong background in Python, SQL, data warehousing platforms such as Snowflake or Google BigQuery, ETL processes, API integrations, and dbt.
Responsibilities include, but are not limited to:
- Design, develop, and maintain scalable and robust ETL pipelines using Python, SQL, and other relevant technologies.
- Work with data warehousing platforms such as Snowflake or Google BigQuery to manage, optimize, and ensure data integrity and consistency.
- Utilize dbt for data modeling and transformation to support analytics and data science initiatives.
- Integrate various data sources, including third-party APIs, into our data ecosystem.
- Collaborate with data scientists, analysts, and other stakeholders to understand data needs and implement solutions.
- Monitor and ensure performance, uptime, and scalability of data systems and processes.
- Document, test, and maintain data workflows and codebase.
- Participate in code reviews, share knowledge, and mentor junior team members.
- Stay updated with the latest trends in data engineering and continuously seek opportunities to innovate and optimize current processes.
Requirements:
- Bachelor’s degree in Computer Science, Engineering, Business/Finance or a related field.
- 4-6 years of hands-on experience in data engineering, analytics, or a similar role.
- Proficiency in Python and SQL.
- Experience with data warehousing platforms such as Snowflake or Google BigQuery.
- Familiarity with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
- Solid understanding of ETL processes and tools, including dbt for data modeling and transformation.
- Experience in integrating and working with APIs.
- Strong analytical and problem-solving skills.
- Effective communication skills, both written and verbal, with the ability to work in cross-functional teams.
- Strong attention to detail and a commitment to producing high-quality results.
Preferred Requirements:
- Master's degree in a related field.
- Experience with data visualization tools like Tableau, Power BI or Sigma Computing.
- Knowledge of other programming languages or tools relevant to the field.
- Understanding of data science methodologies.
We offer a competitive salary and benefits based on ability level, including:
- Base salary DOE
- Unlimited vacation policy
- Monthly phone/internet and food stipend
- Health insurance coverage
- Professional Development Program
- Hybrid work (Bogotá, Colombia)