Keynote | Technical
Friday 17th | 10:40 - 11:10 | Theatre 18
Keywords defining the session:
- Spark, tips
- Best practices
Since its appearance in 2014, Spark has become one of the main tools for Big Data development. It is widely used for batch and real-time applications, for ETL jobs, and for building and using machine learning models. It plays a big role in all the major Hadoop distributions and has been incorporated into many products and frameworks. I will talk about the lessons we learned from our Spark projects, and I will share several Spark tips, tricks, and best practices that can help you get through your Spark projects on time. My book Spark in Action contains a thorough overview of Apache Spark.