Introduction to Spark Operations: A Guide to Administering Spark at Enterprise Scale
Apache Spark may be the most powerful technology to hit the data world since MapReduce, but many enterprises face unique challenges when trying to take advantage of it. With this practical book, system administrators will learn how to work with data science teams to configure, troubleshoot, and optimize Spark clusters at enterprise scale.
You’ll learn everything from initial setup and installation to all facets of architecture, functional testing, and memory management. You’ll also get up to speed with the Spark WebUI and troubleshooting toolkit. Learn how to administer a Spark cluster on the Hadoop Distributed File System (HDFS), and discover how to take advantage of this open source cluster computing framework.
The book provides:
- A quick reference for getting started and administering a Spark cluster
- Practical understanding of the Spark WebUI
- Deeper understanding of Spark’s internals to avoid common missteps