Raju Kumar Mishra
PySpark Recipes
A Problem-Solution Approach with PySpark2
1st ed. 2017. xxiii, 265 pp. 35 b/w illustrations, 12 color illustrations. 235 mm
Publisher/Year: Springer, Berlin; Apress, 2017
ISBN-10: 1-484-23140-6 (1484231406)
ISBN-13: 978-1-484-23140-1 (9781484231401)
Quickly find solutions to common programming problems encountered while processing big data. Content is presented in the popular problem-solution format. Look up the programming problem that you want to solve. Read the solution. Apply the solution directly in your own code. Problem solved!
PySpark Recipes covers Hadoop and its shortcomings. The architectures of Spark, PySpark, and RDDs are presented. You will learn to apply RDDs to solve day-to-day big data problems. Python and NumPy are included as well, making it easy for new learners of PySpark to understand and adopt the model.
What You Will Learn
Understand the advanced features of PySpark2 and SparkSQL
Optimize your code
Program SparkSQL with Python
Use Spark Streaming and Spark MLlib with Python
Perform graph analysis with GraphFrames
Who This Book Is For
Data analysts, Python programmers, big data enthusiasts
Chapter 1: The Era of Big Data and Hadoop
Chapter Goal: The reader learns what big data is and why it is useful, how Hadoop and its ecosystem process big data into useful information, and which shortcomings of Hadoop call for another big data processing platform.
Number of pages: 15-20
Sub-topics:
1. Introduction to big data
2. Big data challenges and processing technology
3. Hadoop, its structure, and its ecosystem
4. Shortcomings of Hadoop
Chapter 2: Python, NumPy and SciPy
Chapter Goal: Get the reader acquainted with Python, NumPy, and SciPy.
Number of pages: 25-30
Sub-topics:
1. Introduction to Python
2. Python collections, string functions, and classes
3. NumPy and ndarray
4. SciPy

Chapter 3: Spark: Introduction, Installation, Structure and PySpark
Chapter Goal: Introduce Spark and its installation on a single machine, then continue with the structure of Spark, and finally introduce PySpark (see the sketch after this chapter entry).
Number of pages: 15-20
Sub-topics:
1. Introduction to Spark
2. Spark installation on Ubuntu
3. Spark architecture
4. PySpark and its architecture
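To give a flavor of what Chapter 3 builds up to, here is a minimal sketch that starts a PySpark application by hand and runs a trivial job; the application name and the local master URL are illustrative values, not settings taken from the book.

    # Minimal PySpark bootstrap: configure and start a SparkContext,
    # run a trivial job, then shut it down.
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("pyspark-intro").setMaster("local[2]")  # illustrative settings
    sc = SparkContext(conf=conf)

    # Distribute a small Python list as an RDD and sum it (here: on local threads).
    total = sc.parallelize(range(1, 101)).sum()
    print(total)  # 5050

    sc.stop()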
Chapter 4: Resilient Distributed Datasets (RDDs)
Chapter Goal: This chapter deals with the core of Spark, the RDD, and with operations on RDDs.
Number of pages: 25-30
Sub-topics:
1. Introduction to RDDs and their characteristics
2. Transformations and actions
3. Operations on RDDs (map, filter, set operations, and many more)
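A minimal sketch of the kind of RDD transformations and actions this chapter covers; the sample data is invented, and an existing SparkContext named sc (as in the pyspark shell) is assumed.

    # Assumes an existing SparkContext `sc` (e.g. from the pyspark shell).
    numbers = sc.parallelize([1, 2, 3, 4, 5, 6])

    squares = numbers.map(lambda x: x * x)          # transformation: evaluated lazily
    evens = squares.filter(lambda x: x % 2 == 0)    # transformation: evaluated lazily

    print(evens.collect())                          # action: [4, 16, 36]

    # A simple set operation on two RDDs.
    a = sc.parallelize([1, 2, 3])
    b = sc.parallelize([3, 4, 5])
    print(a.union(b).distinct().collect())          # e.g. [1, 2, 3, 4, 5] (order not guaranteed)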
Chapter 5: The Power of Pairs: Paired RDDs
Chapter Goal: Paired RDDs make many complex computations easy to program. The reader learns about paired RDDs and the operations on them.
Number of pages: 15-20
Sub-topics:
1. Introduction to paired RDDs
2. Operations on paired RDDs (reduceByKey and related key-based transformations)

Chapter 6: Advanced PySpark and PySpark Application Optimization
Chapter Goal: The reader learns about the advanced PySpark topics of broadcast variables and accumulators, and about optimizing PySpark applications.
Number of pages: 30-35
Sub-topics:
1. Spark accumulators
2. Spark broadcast variables
3. Spark code optimization
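A short sketch combining the themes of Chapters 5 and 6: a paired-RDD aggregation with reduceByKey, plus a broadcast variable and an accumulator. The sales data and price table are invented for illustration, and sc is assumed to be an existing SparkContext.

    # Assumes an existing SparkContext `sc`; data is made up for illustration.
    sales = sc.parallelize([("apple", 3), ("pear", 2), ("apple", 5), ("pear", 1)])

    # Paired-RDD operation: aggregate values per key.
    totals = sales.reduceByKey(lambda a, b: a + b)
    print(totals.collect())            # e.g. [('apple', 8), ('pear', 3)]

    # Broadcast: ship a read-only lookup table to every executor once.
    prices = sc.broadcast({"apple": 0.5, "pear": 0.8})
    revenue = totals.map(lambda kv: (kv[0], kv[1] * prices.value[kv[0]]))
    print(revenue.collect())

    # Accumulator: a counter updated inside tasks, read back on the driver.
    bad_records = sc.accumulator(0)

    def check(kv):
        if kv[1] <= 0:
            bad_records.add(1)
        return kv

    sales.map(check).count()
    print(bad_records.value)           # 0 for this sample data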
Chapter 7: I/O in PySpark
Chapter Goal: The reader learns about PySpark I/O: reading and writing .csv and .json files, and connecting to different databases with PySpark.
Number of pages: 20-30
Sub-topics:
1. Reading and writing JSON and .csv files
2. Reading data from HDFS
3. Reading data from and writing data to different databases
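A sketch of the kind of I/O this chapter describes, using the Spark 2.x DataFrame reader and writer; all file paths, and the commented-out JDBC connection details, are placeholders. Reading from HDFS uses the same calls with an hdfs:// path.

    # Assumes Spark 2.x with a SparkSession; file paths are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("pyspark-io").getOrCreate()

    # Read a CSV file with a header row, inferring column types.
    df_csv = spark.read.csv("data/input.csv", header=True, inferSchema=True)

    # Read a JSON file (one JSON object per line).
    df_json = spark.read.json("data/input.json")

    # Write results back out; mode="overwrite" replaces any existing output.
    df_csv.write.mode("overwrite").json("data/output_json")
    df_json.write.mode("overwrite").csv("data/output_csv", header=True)

    # Databases are typically reached through the JDBC data source;
    # URL, table name, and credentials below are placeholders.
    # df_db = (spark.read.format("jdbc")
    #          .option("url", "jdbc:postgresql://host:5432/dbname")
    #          .option("dbtable", "public.my_table")
    #          .option("user", "user").option("password", "secret")
    #          .load())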
Chapter 8: PySpark Streaming
Chapter Goal: The reader learns real-time data analysis with PySpark Streaming. The chapter focuses on the PySpark Streaming architecture, discretized stream operations, and windowing operations.
Number of pages: 30-40
Sub-topics:
1. PySpark Streaming architecture
2. Discretized streams and their operations
3. The concept of windowing operations
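A minimal windowed word-count sketch in the spirit of this chapter, assuming an existing SparkContext sc and a text source on localhost:9999 (for example, started with nc -lk 9999); the host, port, and durations are illustrative.

    # Assumes an existing SparkContext `sc`.
    from pyspark.streaming import StreamingContext

    ssc = StreamingContext(sc, batchDuration=5)      # 5-second micro-batches
    ssc.checkpoint("checkpoint")                     # required for windowed state

    lines = ssc.socketTextStream("localhost", 9999)
    words = lines.flatMap(lambda line: line.split())
    pairs = words.map(lambda w: (w, 1))

    # Windowed word count: 30-second window, sliding every 10 seconds.
    counts = pairs.reduceByKeyAndWindow(lambda a, b: a + b,
                                        lambda a, b: a - b,
                                        windowDuration=30,
                                        slideDuration=10)
    counts.pprint()

    ssc.start()
    ssc.awaitTermination()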
Chapter 9: SparkSQL
Chapter Goal: The reader learns about SparkSQL and its DataFrame, and how to run SQL commands through SparkSQL.
Number of pages: 40-50
Sub-topics:
1. SparkSQL
2. SQL with SparkSQL
3. Hive commands with SparkSQL
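A small SparkSQL sketch along the lines of this chapter: a DataFrame is registered as a temporary view and queried with plain SQL; the sample rows and view name are made up for illustration.

    # Assumes Spark 2.x.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("sparksql-intro").getOrCreate()

    # Build a small DataFrame and register it as a temporary SQL view.
    people = spark.createDataFrame(
        [("Alice", 34), ("Bob", 29), ("Cara", 41)],
        ["name", "age"],
    )
    people.createOrReplaceTempView("people")

    # Run plain SQL against the view and get a DataFrame back.
    adults = spark.sql("SELECT name, age FROM people WHERE age > 30 ORDER BY age")
    adults.show()

    # The same query expressed through the DataFrame API.
    people.filter(people.age > 30).orderBy("age").select("name", "age").show()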