In this talk we will share our experiences of deploying statistical algorithms to a Hadoop cluster. We will discuss the approaches used to scale R code from processing thousands of data points on a desktop to billions in the cloud.
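The abstract describes scaling statistical R code with map/reduce on Hadoop. The talk itself is not transcribed here, but the core pattern it refers to can be sketched as follows; Python is used as a stand-in for R, and the partitioned data and function names are purely illustrative, not taken from the talk:

```python
# A minimal sketch of the map/reduce pattern for a statistical summary
# (here, a mean), assuming the data arrives as partitioned chunks the
# way a Hadoop job would see them. Data and names are illustrative only.
from functools import reduce

def mapper(chunk):
    """Map phase: each partition emits a partial (sum, count) pair."""
    return (sum(chunk), len(chunk))

def reducer(a, b):
    """Reduce phase: combine partial results associatively, so they
    can be merged in any order across the cluster."""
    return (a[0] + b[0], a[1] + b[1])

# Three "partitions" standing in for HDFS blocks.
partitions = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]

total, count = reduce(reducer, map(mapper, partitions))
print(total / count)  # mean over all nine data points: 5.0
```

The key design point, and the reason a desktop R script does not scale unchanged, is that the combining step must be associative: a running mean cannot be merged across partitions, but a (sum, count) pair can.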
Taking Data Science to the Data Centre
Anette is a consultant at ThoughtWorks, where she builds people, teams and projects, and occasionally a bit of code. She has worked in a number of countries, industries and development stacks to solve all sorts of problems, but lately it has be
Brian helps clients make the usually difficult transition from a traditional analyse, develop and test development model to a more rapid, repeatable and agile mode of delivery.