Streamroot: building the video delivery of tomorrow
At Streamroot, we’re working to overcome one of the biggest challenges facing the internet today: the explosion of video traffic. Our goal is to redesign the way we deliver video online, to create more robust, cost-effective infrastructures, and to touch millions of internet users by bringing quality video to every corner of the world.
With a hybrid peer-to-peer solution to video streaming, Streamroot helps online video platforms improve quality of service, scale to growing audiences and cut their bandwidth costs by up to 70%.
We are market leaders in both Europe and the US. Founded in 2013 by three engineers from Ecole Centrale Paris, Streamroot grew from 3 to 20+ employees last year and now has offices in Paris and New York. We took part in some of the best startup accelerators in the world (Techstars, Numa) and now count Dailymotion, Eurosport and other streaming giants among our customers.
As a member of our Data & Efficiency team, you will help us understand the huge amounts of data we collect from video viewers all over the world. You will transform this data into comprehensive representations, providing our team with insights into our product and actionable feedback on how to improve its performance.
Our development environment fosters initiative, collaboration and accountability. You will have the chance to make a difference across our team: from core peer-to-peer development and the backend that supports it to the customer-facing sales and support teams.
You will take the lead in discussions about improving our algorithms and about which metrics should be used to measure gains. You will give the team visibility by reporting on each new improvement pushed to production, and you may even have the opportunity to implement these improvements in our core code!
And last but not least, you’ll enjoy frequent trips and team events (and the special Streamroot happy hour at the bar next door!).
This internship is a great stepping stone for those interested in a career in big data & machine learning, and it can lead to a full-time position on our team.
Key responsibilities
- Work with our Data Science Task Force to create metrics, update and clean our datasets, and analyze our data with Druid and Superset (formerly Caravel), our data analytics pipeline.
- Understand the inner workings of our core P2P product, and devise new ways of measuring and improving the performance of our algorithms.
- Design weekly and monthly reports to provide actionable feedback on our product.
Requirements
- Hands-on. Fast learner. Passionate. Independent. Persistent. Seeking constant self-improvement.
- Great attention to detail and ability to summarize data in a concise and analytical way.
- Proficiency in at least one programming language (JavaScript, Python, Ruby, etc.) is not required, but it is a big plus.
- Strong foundations in mathematics and statistics. Experience in data-mining is a plus.
- Experience with R or Matlab is a plus.
- Fluent in English.
Benefits
Starting ASAP, for at least 5 months.
Possible transition to a full-time contract.
Free coffee and fruit every day!