Talend for Data Integration and Big Data
Talend – A Revolution in Big Data
• Core ETL concepts
• Talend products and their features
• Design and implementation of Talend Open Studio
Working with Talend Open Studio for DI
• Launching Talend Studio
• Working with different workspace directories
• Working with projects
• Creating and executing jobs
• Connection types and triggers
• Most frequently used Talend components [tJava, tLogRow, tMap]
• Read & Write Various Types of Source/Target Systems
• Working with files [CSV, XLS, XML, Positional]
• Working with databases [MySQL DB]
• Metadata management
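To give a feel for what a simple tFileInputDelimited → tMap → tLogRow flow does, here is a minimal sketch in plain Java (no Talend runtime needed). The delimiter, schema columns, and field names are illustrative assumptions, not Talend APIs; in Studio the same logic is configured visually in the tMap editor.

```java
public class RowMappingSketch {
    // Input row: id;name;salary (what tFileInputDelimited would emit per line)
    static String mapRow(String csvLine) {
        String[] f = csvLine.split(";");          // field separator, as configured in the component
        int id = Integer.parseInt(f[0].trim());   // schema column: id (Integer)
        String name = f[1].trim().toUpperCase();  // tMap expression: row1.name.toUpperCase()
        double salary = Double.parseDouble(f[2].trim()); // schema column: salary (Double)
        // tLogRow-style pipe-separated output of the mapped row
        return id + "|" + name + "|" + salary;
    }

    public static void main(String[] args) {
        String[] lines = { "1;alice;52000", "2;bob;48000" };
        for (String line : lines) {
            System.out.println(mapRow(line));
        }
    }
}
```

The point of the sketch: every Talend job compiles down to Java code very much like this, which is why a little Java (e.g. inside tJava) goes a long way.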
Basic Transformations in Talend
• Context Variables
• Using Talend components
• Accessing job level/ component level information within the job
• SubJob (using tRunJob, tPreJob, tPostJob)
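Context variables let the same job run against different environments (e.g. Dev vs. Prod) without editing the job. The sketch below mimics that behaviour in plain Java; the variable names and host values are illustrative assumptions.

```java
import java.util.HashMap;
import java.util.Map;

public class ContextSketch {
    // Each context (Dev, Prod, ...) supplies its own values for the same variable names
    static Map<String, String> loadContext(String contextName) {
        Map<String, String> ctx = new HashMap<>();
        if ("Prod".equals(contextName)) {
            ctx.put("db_host", "prod-db.example.com");
            ctx.put("db_port", "3306");
        } else { // default / Dev context
            ctx.put("db_host", "localhost");
            ctx.put("db_port", "3306");
        }
        return ctx;
    }

    public static void main(String[] args) {
        // Inside a tJava component the same lookup is written as context.db_host
        Map<String, String> context = loadContext("Dev");
        System.out.println("jdbc:mysql://" + context.get("db_host") + ":" + context.get("db_port"));
    }
}
```

In Studio, contexts are defined in the job's Contexts tab and referenced in component settings as `context.variableName`.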
Advanced Transformations and Executing Jobs Remotely in Talend
• Various components of file management (like tFileList, tFileArchive, tFileTouch, tFileDelete)
• Error Handling [tWarn, tDie]
• Type Casting (convert datatypes among source-target platforms)
• Looping components (like tLoop, tForeach)
• Using FTP components (like tFTPFileList, tFTPFileExists, tFTPGet, tFTPPut)
• Exporting and Importing Talend jobs
• How to schedule and run Talend DI jobs externally (using the command line)
• Parameterizing a Talend job from the command line
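Exported Talend jobs ship with a generated launcher script (.sh/.bat) that accepts flags such as `--context=Prod` and `--context_param key=value`. The sketch below shows how those flags are parsed; the parsing code itself is an illustrative re-implementation, not Talend's generated code.

```java
import java.util.HashMap;
import java.util.Map;

public class JobLauncherSketch {
    static Map<String, String> parseArgs(String[] args) {
        Map<String, String> params = new HashMap<>();
        for (int i = 0; i < args.length; i++) {
            if (args[i].startsWith("--context=")) {
                // selects which context group (Dev, Prod, ...) the job runs with
                params.put("context", args[i].substring("--context=".length()));
            } else if (args[i].equals("--context_param") && i + 1 < args.length) {
                // overrides a single context variable: --context_param key=value
                String[] kv = args[++i].split("=", 2);
                params.put(kv[0], kv[1]);
            }
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> p = parseArgs(new String[] {
            "--context=Prod", "--context_param", "db_host=prod-db.example.com"
        });
        System.out.println(p.get("context") + " / " + p.get("db_host"));
    }
}
```

Because the launcher is just a script, any external scheduler (cron, Windows Task Scheduler) can run the job with different parameters per invocation.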
Big Data and Hadoop with Talend
• Big Data and Hadoop
• HDFS and MapReduce
• Benefits of using Talend with Big Data
• Integration of Talend with Big Data
• HDFS commands vs. Talend HDFS utility
• Big Data setup using the Hortonworks Sandbox on your personal computer
• Explaining the TOS for Big Data Environment
Hive in Talend
• Hive and Its Architecture
• Connecting to Hive Shell
• Set connection to Hive database using Talend
• Create Hive Managed and external tables through Talend
• Load and Process Hive data using Talend
• Transform data from Hive using Talend
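Under the hood, tHiveConnection and tHiveCreateTable assemble a HiveServer2 JDBC URL and a HiveQL DDL statement. The sketch below builds both as strings; the host, port, database, and table layout are illustrative assumptions. Actually opening the connection additionally requires the Hive JDBC driver (`org.apache.hive.jdbc.HiveDriver`) on the classpath, which is omitted here.

```java
public class HiveDdlSketch {
    // HiveServer2 JDBC URL, as tHiveConnection would build it
    static String jdbcUrl(String host, int port, String db) {
        return "jdbc:hive2://" + host + ":" + port + "/" + db;
    }

    // CREATE EXTERNAL TABLE statement, as tHiveCreateTable would generate it;
    // an external table leaves the underlying HDFS files in place when dropped
    static String externalTableDdl(String table, String location) {
        return "CREATE EXTERNAL TABLE IF NOT EXISTS " + table + " ("
             + "id INT, name STRING) "
             + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ';' "
             + "LOCATION '" + location + "'";
    }

    public static void main(String[] args) {
        System.out.println(jdbcUrl("sandbox.hortonworks.com", 10000, "default"));
        System.out.println(externalTableDdl("customers", "/user/talend/customers"));
    }
}
```

Managed tables differ only in omitting `EXTERNAL` (and usually `LOCATION`): Hive then owns the data and deletes it when the table is dropped.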
Pig and Kafka in Talend
• Pig Environment in Talend
• Pig Data Connectors
• Integrate Personalized Pig Code into a Talend job
• Apache Kafka
• Kafka Components in TOS for Big data
Complimentary sessions on communication, presentation, and leadership skills.
Benefits from the course
The IT market's growing interest in Big Data has driven higher demand for jobs in this field.
If you are interested in pursuing a career in this field, the Talend course is a strong choice. Talend training for data integration and Big Data will teach you how to use Talend Open Studio to simplify data integration.
Using Talend, you can set up communication with Big Data technologies without writing a single line of code.
There are no prerequisites for the Talend course in general.
Knowledge of Data Warehousing is beneficial, but certainly not mandatory.
To brush up on or learn Data Warehousing concepts, a complimentary self-paced course, "Data Warehousing Certification Training", is offered when you enroll in the Talend Certification course.
2-hour daytime slots or 3-hour weekend slots (subject to change)