title
SSIS Tutorial For Beginners | SQL Server Integration Services (SSIS) | MSBI Training Video | Edureka

description
🔥 Microsoft BI Certification (Use Code "𝐘𝐎𝐔𝐓𝐔𝐁𝐄𝟐𝟎"): https://www.edureka.co/microsoft-bi-certification

This Edureka SSIS Tutorial video will help you learn the basics of MSBI. SSIS is a platform for data integration and workflow applications. This video covers data warehousing concepts, which are used for data extraction, transformation and loading (ETL). It is ideal for both beginners and professionals who want to brush up on the basics of MSBI. This Edureka training video provides knowledge on the following topics:
1. Why do we need data integration?
2. What is data integration?
3. Why SSIS?
4. What is SSIS?
5. ETL process
6. Data Warehousing
7. Installation
8. What is an SSIS Package?
9. Demo

Subscribe to our channel to get video updates. Hit the subscribe button above. Check our complete Microsoft BI playlist here: https://goo.gl/Vo6Klo
Edureka Elevate Program. Learn now, pay later: http://bit.ly/2vQKVu6
#SSIS #SSISTutorial #MicrosoftBI #MicrosoftBItutorial #MicrosoftBIcourse

How it Works?
1. This is a 30-hour online live instructor-led course. Weekend class: 10 sessions of 3 hours each. Weekday class: 15 sessions of 2 hours each.
2. We have 24x7 one-on-one LIVE technical support to help you with any problems you might face or any clarifications you may require during the course.
3. At the end of the training you will be working on a real-time project, for which we will provide you a grade and a verifiable certificate!

- - - - - - - - - - - - - - - - -

About the Course
Edureka's Microsoft BI Certification Course is designed to provide insights into the different tools in the Microsoft BI Suite (SQL Server Integration Services, SQL Server Analysis Services, SQL Server Reporting Services). Gain expertise in SSIS, SSAS and SSRS concepts and master them. The course will give you practical knowledge of data warehouse concepts and of how these tools help in developing a robust end-to-end BI solution using the Microsoft BI Suite.

Who should go for this course?
The Microsoft BI Certification Course at Edureka is designed for professionals aspiring to make a career in Business Intelligence. Software or analytics professionals with a background or experience in any RDBMS, ETL, OLAP or reporting tool are the key beneficiaries of this MSBI course. You can check our Microsoft BI–related blog, Why You Need It For A Better Business Intelligence Career!! Also, once your Microsoft BI training is over, you can check the Edureka blog on Microsoft Business Intelligence interview questions.

Why learn Microsoft BI?
As we move from experience- and intuition-based decision making to data-driven decision making, it is increasingly important to capture data and store it in a way that allows us to make smarter decisions. This is where data warehousing and Business Intelligence come into the picture. There is a huge demand for Business Intelligence professionals, and this course acts as a foundation that opens the door to a variety of opportunities in the Business Intelligence space. Though there are many vendors providing BI tools, very few of them provide an end-to-end BI suite and a large customer base. Microsoft stands out as a leader with its user-friendly and cost-effective Business Intelligence suite, helping customers get a 360-degree view of their businesses.

Please write back to us at sales@edureka.co or call us at +918880862004 or 18002759730 for more information.
Website: https://www.edureka.co/microsoft-bi Facebook: https://www.facebook.com/edurekaIN/ Twitter: https://twitter.com/edurekain LinkedIn: https://www.linkedin.com/company/edureka

Customer Reviews: Amit Vij, HRSSC HRIS Senior Advisor at DLA Piper, says: "I am not a big fan of online courses and had also opted for classroom-based training sessions in the past. To my surprise, I had a WoW factor when I attended the first session of my MSBI course with Edureka. Presentation - Check, Faculty - Check, Voice Clarity - Check, Course Content - Check, Course Schedule and Breaks - Check, Revisiting Past Modules - Awesome with a big check. I like the way the classes were organised and the faculty was far above and beyond expectations. I will recommend Edureka to everyone and will personally revisit them for my future learnings."

detail
{'title': 'SSIS Tutorial For Beginners | SQL Server Integration Services (SSIS) | MSBI Training Video | Edureka', 'heatmap': [{'end': 1232.674, 'start': 1193.864, 'weight': 0.897}, {'end': 1451.046, 'start': 1408.253, 'weight': 0.729}], 'summary': 'This tutorial covers ssis, data integration, benefits of ssis, etl process, data warehouse basics, ssis installation, package creation, sql table transformation, and the etl process, with a demonstration of integrating real-life data and loading 500 rows into a database, highlighting the advantages of data integration and ssis components.', 'chapters': [{'end': 125.81, 'segs': [{'end': 51.407, 'src': 'embed', 'start': 21.762, 'weight': 0, 'content': [{'end': 24.363, 'text': 'Now data integration is achieved using various tools.', 'start': 21.762, 'duration': 2.601}, {'end': 31.107, 'text': 'So my next topic would be why SSIS came into the picture and why it is preferred over the other data integration tools.', 'start': 24.983, 'duration': 6.124}, {'end': 36.392, 'text': "Next we'll be looking deep into what is SSIS and what are the different components of SSIS.", 'start': 31.787, 'duration': 4.605}, {'end': 44.18, 'text': "So we'll be learning about ETL data warehousing followed by a quick installation of Visual Studio and data tools needed for the same.", 'start': 37.013, 'duration': 7.167}, {'end': 51.407, 'text': "Moving towards the later half of the session, we'll be discussing what is a SSIS package and how you can create a package in Visual Studio.", 'start': 44.72, 'duration': 6.687}], 'summary': 'Ssis is a preferred data integration tool; covers etl, data warehousing, and package creation.', 'duration': 29.645, 'max_score': 21.762, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw21762.jpg'}, {'end': 88.471, 'src': 'embed', 'start': 59.01, 'weight': 2, 'content': [{'end': 60.831, 'text': 'So I hope you guys are clear with the agenda.', 'start': 59.01, 'duration': 1.821}, {'end': 63.912, 'text': 'Kindly drop me a quick confirmation so that I can proceed.', 'start': 61.371, 'duration': 2.541}, {'end': 66.552, 'text': 'Okay, Anna says yes.', 'start': 63.932, 'duration': 2.62}, {'end': 68.153, 'text': 'So does Sam.', 'start': 67.473, 'duration': 0.68}, {'end': 70.214, 'text': 'Maybe give me a thumbs up.', 'start': 69.193, 'duration': 1.021}, {'end': 73.757, 'text': 'All right, so I got confirmations from many of you.', 'start': 70.915, 'duration': 2.842}, {'end': 75.299, 'text': "So let's begin the session.", 'start': 74.118, 'duration': 1.181}, {'end': 83.286, 'text': 'So my first topic is why do we need data integration? 
So let me take example here as to understand the need of data integration.', 'start': 75.999, 'duration': 7.287}, {'end': 88.471, 'text': "So if you look at the image here, suppose I'm in a company where I have different departments.", 'start': 83.707, 'duration': 4.764}], 'summary': 'Agenda confirmed by anna and sam, proceeding with data integration session', 'duration': 29.461, 'max_score': 59.01, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw59010.jpg'}, {'end': 136.515, 'src': 'embed', 'start': 105.325, 'weight': 1, 'content': [{'end': 107.727, 'text': 'Similarly, different team uses different database.', 'start': 105.325, 'duration': 2.402}, {'end': 109.969, 'text': 'My marketing team is using Oracle.', 'start': 108.268, 'duration': 1.701}, {'end': 112.451, 'text': 'Manufacturing is using DB2 and many more.', 'start': 110.229, 'duration': 2.222}, {'end': 115.194, 'text': 'So all this depends on the requirements of your company.', 'start': 112.752, 'duration': 2.442}, {'end': 119.241, 'text': 'Now I have different databases where my different type of data is stored.', 'start': 115.956, 'duration': 3.285}, {'end': 125.81, 'text': "But let's say my manager asked me to analyze all the departments and tell me who brings the best revenue out of all the teams.", 'start': 119.661, 'duration': 6.149}, {'end': 127.152, 'text': 'So what would I do now?', 'start': 126.351, 'duration': 0.801}, {'end': 128.755, 'text': 'Any idea, guys?', 'start': 127.813, 'duration': 0.942}, {'end': 132.2, 'text': 'if you could suggest me with something or you could help me with this current problem?', 'start': 128.755, 'duration': 3.445}, {'end': 136.515, 'text': 'Okay, so Sam tells me you can connect the databases.', 'start': 133.894, 'duration': 2.621}], 'summary': 'Different teams use varied databases. marketing uses oracle, manufacturing uses db2. analyzing revenue across departments requires connecting databases.', 'duration': 31.19, 'max_score': 105.325, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw105325.jpg'}], 'start': 0.069, 'title': 'Ssis tutorial: data integration', 'summary': 'Discusses the need for data integration, the role of ssis in integrating different databases, and a demonstration of integrating real-life data using ssis, followed by audience confirmation for the agenda.', 'chapters': [{'end': 125.81, 'start': 0.069, 'title': 'Ssis tutorial: data integration with microsoft', 'summary': 'Discusses the need for data integration, the role of ssis in integrating different databases, and a demonstration of integrating real-life data using ssis, followed by audience confirmation for the agenda.', 'duration': 125.741, 'highlights': ['SSIS is preferred over other data integration tools SSIS is discussed as the preferred tool for data integration, indicating its superiority over other tools.', 'Different teams using different databases based on company requirements The example of different company departments using different databases (SAP, Salesforce CRM, Oracle, DB2, etc.) 
is provided to illustrate the diverse data storage needs within a company.', 'Audience confirmation received before proceeding with the session The speaker requests audience confirmation for the agenda, and multiple confirmations are received, indicating active participation and engagement.']}], 'duration': 125.741, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw69.jpg', 'highlights': ['SSIS is preferred over other data integration tools', 'Different teams using different databases based on company requirements', 'Audience confirmation received before proceeding with the session']}, {'end': 378.507, 'segs': [{'end': 176.753, 'src': 'embed', 'start': 147.44, 'weight': 1, 'content': [{'end': 151.842, 'text': 'Moreover, dealing with these connected databases will create more complexities for you.', 'start': 147.44, 'duration': 4.402}, {'end': 158.285, 'text': 'Because if you have large data, say you have 100 databases, and to connect all of them, it will consume a lot of time.', 'start': 152.142, 'duration': 6.143}, {'end': 161.987, 'text': "So I hope you understand why I'm not saying to connect the databases.", 'start': 158.825, 'duration': 3.162}, {'end': 168.945, 'text': 'Okay, but now what would the solution then? Here a simple solution would be data integration.', 'start': 162.938, 'duration': 6.007}, {'end': 176.753, 'text': 'Now by data integration what I mean is you can integrate all your data present in different databases and combine them at the same platform.', 'start': 169.625, 'duration': 7.128}], 'summary': 'Connecting 100 databases will consume a lot of time, so data integration is the solution.', 'duration': 29.313, 'max_score': 147.44, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw147440.jpg'}, {'end': 215.673, 'src': 'embed', 'start': 188.906, 'weight': 0, 'content': [{'end': 193.067, 'text': 'So data integration is a process you follow to get data from multiple sources.', 'start': 188.906, 'duration': 4.161}, {'end': 195.028, 'text': 'Your data can be in any form.', 'start': 193.688, 'duration': 1.34}, {'end': 197.388, 'text': 'It can be heterogeneous or it can be homogeneous.', 'start': 195.148, 'duration': 2.24}, {'end': 203.97, 'text': 'Now by these terms, I mean data can be in structured form, it can be semi-structured form, it can be unstructured.', 'start': 198.069, 'duration': 5.901}, {'end': 206.27, 'text': 'So these are dissimilar data.', 'start': 204.81, 'duration': 1.46}, {'end': 215.673, 'text': "But if these dissimilar data combines together into meaningful and valuable information, wouldn't that be great? 
So this exactly is data integration.", 'start': 206.831, 'duration': 8.842}], 'summary': 'Data integration is the process of combining dissimilar data from multiple sources into meaningful and valuable information.', 'duration': 26.767, 'max_score': 188.906, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw188906.jpg'}, {'end': 293.935, 'src': 'embed', 'start': 227.216, 'weight': 3, 'content': [{'end': 232.942, 'text': 'So here I have listed few ways from which you can achieve data integration, such as data modeling,', 'start': 227.216, 'duration': 5.726}, {'end': 236.165, 'text': 'where you first create a model and then perform operations on it.', 'start': 232.942, 'duration': 3.223}, {'end': 244.153, 'text': 'Then there is data profiling, where you take a sample data and check if there is some inconsistencies, errors, or some variations to it.', 'start': 236.825, 'duration': 7.328}, {'end': 248.377, 'text': 'Similarly, there are different methods from which you can achieve data integration.', 'start': 244.813, 'duration': 3.564}, {'end': 253.029, 'text': 'Now it is not necessary to achieve data integration using a single process.', 'start': 249.147, 'duration': 3.882}, {'end': 255.41, 'text': 'There may be a collection of different processes.', 'start': 253.589, 'duration': 1.821}, {'end': 259.452, 'text': 'Therefore there is no limitation as to how you achieve data integration.', 'start': 256.05, 'duration': 3.402}, {'end': 262.654, 'text': "Now let's see some advantages of data integration.", 'start': 260.293, 'duration': 2.361}, {'end': 266.276, 'text': 'So the first advantage is it reduces complexity.', 'start': 263.514, 'duration': 2.762}, {'end': 270.678, 'text': 'Now by saying this I mean it is easy to deliver data to any system.', 'start': 267.096, 'duration': 3.582}, {'end': 279.139, 'text': 'So data integration is all about managing complexity, streamlining these connections, and making it easy to deliver data to any system.', 'start': 271.892, 'duration': 7.247}, {'end': 282.129, 'text': 'Now the second advantage is data integrity.', 'start': 280.168, 'duration': 1.961}, {'end': 285.191, 'text': 'Now integrity has a major role in data integration.', 'start': 282.709, 'duration': 2.482}, {'end': 289.833, 'text': 'So data integrity basically deals with cleansing and validating your data.', 'start': 285.871, 'duration': 3.962}, {'end': 293.935, 'text': 'So all of us need our data to be high quality and robust, right?', 'start': 290.673, 'duration': 3.262}], 'summary': 'Various methods for achieving data integration, with advantages like reduced complexity and improved data integrity.', 'duration': 66.719, 'max_score': 227.216, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw227216.jpg'}, {'end': 345.643, 'src': 'embed', 'start': 316.732, 'weight': 6, 'content': [{'end': 318.814, 'text': 'So here the data collaboration factor lies.', 'start': 316.732, 'duration': 2.082}, {'end': 321.798, 'text': 'Last but not the least, smarter business decision.', 'start': 319.435, 'duration': 2.363}, {'end': 328.404, 'text': 'Now you can make smarter decisions as your integrated data also refers to transparent processes within your company.', 'start': 322.518, 'duration': 5.886}, {'end': 332.752, 'text': 'So here you are given the opportunity to better understand the information.', 'start': 329.145, 'duration': 3.607}, {'end': 335.457, 'text': 'Hence it is much easier and more 
informative.', 'start': 333.293, 'duration': 2.164}, {'end': 339.545, 'text': 'So based on the information collected, you can actually make smarter decisions.', 'start': 335.958, 'duration': 3.587}, {'end': 345.643, 'text': 'So I hope now you guys are clear with the need, what it is, and different advantages of data integration.', 'start': 340.779, 'duration': 4.864}], 'summary': 'Data integration enables smarter business decisions and better understanding of information, leading to transparent processes and improved decision-making.', 'duration': 28.911, 'max_score': 316.732, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw316732.jpg'}], 'start': 126.351, 'title': 'Data integration and its advantages', 'summary': 'Introduces the concept of data integration, emphasizing complexities and time consumption of connecting multiple databases, and proposes a solution. it also discusses various methods of achieving data integration, including data modeling and profiling, highlighting its advantages such as reducing complexity, ensuring data integrity, promoting data collaboration, and enabling smarter business decisions.', 'chapters': [{'end': 226.696, 'start': 126.351, 'title': 'Data integration: simplifying data management', 'summary': 'Introduces the concept of data integration and its benefits, emphasizing the complexities and time consumption of connecting multiple databases, and proposing a solution to integrate data from different sources to create meaningful and valuable information.', 'duration': 100.345, 'highlights': ['Data integration is a process used to obtain data from multiple sources, including structured, semi-structured, and unstructured forms, and combining them to generate meaningful and valuable information.', 'Connecting multiple databases can be time-consuming, especially when dealing with a large amount of data, such as 100 databases.', 'The complexities of dealing with connected databases are highlighted, indicating the potential challenges and limitations of this approach.', 'The proposed solution of data integration is presented as a simpler and more efficient approach to managing data from diverse sources.']}, {'end': 378.507, 'start': 227.216, 'title': 'Data integration: methods and advantages', 'summary': 'Discusses various methods of achieving data integration, including data modeling and profiling, and highlights the advantages of data integration, such as reducing complexity, ensuring data integrity, promoting data collaboration, and enabling smarter business decisions.', 'duration': 151.291, 'highlights': ['Data integration methods include data modeling and profiling. Data modeling involves creating a model and performing operations on it, while data profiling involves checking sample data for inconsistencies, errors, or variations.', 'Advantage of data integration: Reducing complexity and ensuring data integrity. Data integration reduces complexity by streamlining connections and making it easy to deliver data to any system. Data integrity deals with cleansing and validating data to ensure it is error-free and consistent.', 'Advantage of data integration: Promoting data collaboration and enabling smarter business decisions. Data collaboration is made easier with accessible data, leading to easier collaboration and sharing of results. 
Integrated data facilitates smarter business decisions by providing transparent processes and better understanding of information.']}], 'duration': 252.156, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw126351.jpg', 'highlights': ['Data integration involves obtaining data from multiple sources and combining them to generate valuable information.', 'Connecting multiple databases can be time-consuming, especially when dealing with a large amount of data.', 'The complexities and potential challenges of dealing with connected databases are highlighted.', 'The proposed solution of data integration is presented as a simpler and more efficient approach to managing data from diverse sources.', 'Data modeling involves creating a model and performing operations on it, while data profiling involves checking sample data for inconsistencies, errors, or variations.', 'Advantage of data integration: Reducing complexity and ensuring data integrity.', 'Advantage of data integration: Promoting data collaboration and enabling smarter business decisions.']}, {'end': 559.97, 'segs': [{'end': 494.744, 'src': 'embed', 'start': 379.567, 'weight': 0, 'content': [{'end': 384.991, 'text': "Now let's just focus on Microsoft SQL Server Integration Services that is SSIS.", 'start': 379.567, 'duration': 5.424}, {'end': 390.796, 'text': "So before going to what exactly is SSIS, first let's understand the why part.", 'start': 386.352, 'duration': 4.444}, {'end': 397.945, 'text': 'So why SSIS? So now my first point is data can be loaded in parallel to many varied destinations.', 'start': 391.136, 'duration': 6.809}, {'end': 402.477, 'text': "So as we've already learned that it collects data from multiple sources.", 'start': 398.707, 'duration': 3.77}, {'end': 409.639, 'text': 'Now SSIS is responsible for connecting to each data source, extracting the data, and merging it into a single data set.', 'start': 403.277, 'duration': 6.362}, {'end': 412.399, 'text': 'So this is how SSIS plays an important role.', 'start': 410.019, 'duration': 2.38}, {'end': 415.5, 'text': 'Next, it removes the need of hardcore programmers.', 'start': 412.919, 'duration': 2.581}, {'end': 423.422, 'text': 'This is because SSIS includes the capability to load large amount of data directly from the flat file or Excel file in SQL Server.', 'start': 416.06, 'duration': 7.362}, {'end': 427.104, 'text': 'Next we have tight integration with all the products of Microsoft.', 'start': 424.022, 'duration': 3.082}, {'end': 431.787, 'text': 'Also the best part about SSIS is it is cheaper than most of the other tools.', 'start': 427.684, 'duration': 4.103}, {'end': 438.411, 'text': 'If you compare it with respect to base product, their manageability, business intelligence, availability, and multi-core.', 'start': 432.347, 'duration': 6.064}, {'end': 444.855, 'text': "So these are few of the points that I've listed as to why you prefer SSIS over other data integration tools.", 'start': 438.991, 'duration': 5.864}, {'end': 448.097, 'text': "Now let's understand what exactly is SSIS.", 'start': 445.895, 'duration': 2.202}, {'end': 455.163, 'text': 'So as we all know, SSIS is a service of Microsoft that basically performs data integration.', 'start': 449.861, 'duration': 5.302}, {'end': 464.166, 'text': 'Or you can say merging of data from different data sources, which can be from flat file, it can be from Excel, it can be from SAP, Oracle or anything.', 'start': 455.683, 'duration': 8.483}, {'end': 469.928, 
'text': 'So it is basically used to perform a broad range of data integration as well as data transformation tasks.', 'start': 464.806, 'duration': 5.122}, {'end': 473.989, 'text': 'So in a whole, you can say it basically performs data migration.', 'start': 470.608, 'duration': 3.381}, {'end': 479.114, 'text': 'So SSIS is a platform for data integration and workflow applications.', 'start': 474.892, 'duration': 4.222}, {'end': 485.558, 'text': 'So by data integration, we already know that data is retrieved and combined in a structure which has a unified view.', 'start': 479.895, 'duration': 5.663}, {'end': 487.559, 'text': 'Next we have workflow.', 'start': 486.359, 'duration': 1.2}, {'end': 489.941, 'text': 'Now a workflow can do several things.', 'start': 488.08, 'duration': 1.861}, {'end': 494.744, 'text': 'Sometimes you just need some steps or path in the package execution,', 'start': 490.561, 'duration': 4.183}], 'summary': 'Ssis enables parallel data loading, integration with microsoft products, and cost-effectiveness.', 'duration': 115.177, 'max_score': 379.567, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw379567.jpg'}, {'end': 539.926, 'src': 'embed', 'start': 516.9, 'weight': 7, 'content': [{'end': 524.442, 'text': "So we'll be talking about SSIS package in further more slides wherein I'll be teaching you how you can create an SSIS package in your Visual Studio.", 'start': 516.9, 'duration': 7.542}, {'end': 528.043, 'text': 'So SSIS consists of three major components.', 'start': 525.542, 'duration': 2.501}, {'end': 533.444, 'text': 'The first is operational data followed by an ETL process and then the data warehouse.', 'start': 528.623, 'duration': 4.821}, {'end': 536.845, 'text': "So let's understand each one of them in detail now.", 'start': 534.485, 'duration': 2.36}, {'end': 539.926, 'text': 'So the first component is operational data.', 'start': 537.806, 'duration': 2.12}], 'summary': 'Ssis package consists of three major components: operational data, etl process, and data warehouse.', 'duration': 23.026, 'max_score': 516.9, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw516900.jpg'}], 'start': 379.567, 'title': 'Benefits of ssis and key components', 'summary': 'Emphasizes the importance and benefits of ssis, such as parallel data loading, integration with microsoft products, cost-effectiveness, and key components including operational data and etl process.', 'chapters': [{'end': 423.422, 'start': 379.567, 'title': 'Ssis: key functions and benefits', 'summary': 'Explains the importance of ssis in loading data in parallel to varied destinations, extracting and merging data from multiple sources, and eliminating the need for hardcore programmers by enabling direct loading of large data from flat files or excel into sql server.', 'duration': 43.855, 'highlights': ['SSIS enables loading data in parallel to varied destinations, enhancing efficiency and speed of data transfer.', 'It connects to multiple data sources, extracts data, and merges it into a single data set, streamlining the data integration process.', 'SSIS eliminates the necessity for hardcore programmers by facilitating the direct loading of large data from flat files or Excel into SQL Server.']}, {'end': 559.97, 'start': 424.022, 'title': 'Benefits of ssis and its key components', 'summary': 'Highlights the benefits of ssis including its tight integration with microsoft products, cost-effectiveness, 
data integration capabilities, and workflow applications. it also explains the major components of ssis including operational data, etl process, and data warehouse.', 'duration': 135.948, 'highlights': ['SSIS is cheaper than most other tools SSIS is cost-effective compared to other tools, making it a preferred choice for data integration. This cost advantage can be quantified by comparing its pricing with that of other data integration tools.', "SSIS is tightly integrated with all Microsoft products The tight integration of SSIS with Microsoft products enhances its manageability, business intelligence, and availability. This integration can be quantified by assessing the seamless compatibility and interoperability of SSIS with Microsoft's suite of products.", 'SSIS performs data integration from various sources including flat files, Excel, SAP, and Oracle SSIS has the capability to merge and integrate data from diverse sources, such as flat files, Excel, SAP, and Oracle. This broad range of data integration can be quantified by the number of different data sources it can effectively handle.', 'SSIS is a platform for data integration and workflow applications SSIS serves as a platform for both data integration and workflow applications, providing a unified view of data and enabling the execution of customizable workflows. This capability can be quantified by the number of diverse data integration and workflow tasks supported by SSIS.', 'SSIS consists of major components including operational data, ETL process, and data warehouse SSIS is comprised of key components such as operational data, ETL (Extract, Transform, Load) process, and data warehouse. Understanding these components is essential for comprehending the functionality and scope of SSIS. The quantifiable nature of these components can be understood through their respective roles in data management and integration.']}], 'duration': 180.403, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw379567.jpg', 'highlights': ['SSIS enables parallel data loading to varied destinations, enhancing efficiency and speed.', 'SSIS connects to multiple data sources, extracts data, and merges it into a single data set, streamlining the integration process.', 'SSIS eliminates the necessity for hardcore programmers by facilitating direct loading of large data from flat files or Excel into SQL Server.', 'SSIS is cost-effective compared to other tools, making it a preferred choice for data integration.', 'The tight integration of SSIS with Microsoft products enhances its manageability, business intelligence, and availability.', 'SSIS has the capability to merge and integrate data from diverse sources, such as flat files, Excel, SAP, and Oracle.', 'SSIS serves as a platform for both data integration and workflow applications, providing a unified view of data and enabling the execution of customizable workflows.', 'SSIS consists of key components such as operational data, ETL process, and data warehouse, essential for comprehending its functionality and scope.']}, {'end': 873.372, 'segs': [{'end': 690.891, 'src': 'embed', 'start': 582.338, 'weight': 0, 'content': [{'end': 587.562, 'text': 'So ETL is a process responsible for pulling data out of the source which can be of any format.', 'start': 582.338, 'duration': 5.224}, {'end': 591.685, 'text': 'It can be XML, flat file and placing the whole into a data warehouse.', 'start': 587.662, 'duration': 4.023}, {'end': 599.451, 'text': 'Also, an ETL process 
ensures that the data stored in the warehouse is relevant, it is useful to the business users,', 'start': 592.365, 'duration': 7.086}, {'end': 601.412, 'text': 'it is accurate and it is high quality.', 'start': 599.451, 'duration': 1.961}, {'end': 607.457, 'text': 'Also it is easy to access so that the warehouse is used efficiently and effectively by the business users.', 'start': 601.892, 'duration': 5.565}, {'end': 617.005, 'text': 'So ETL helps an organization to make meaningful data-driven decisions by interpreting and transforming large amount of structured and unstructured data.', 'start': 608.137, 'duration': 8.868}, {'end': 622.369, 'text': 'Even though ETL is a three-word concept, but it is actually divided into four phases.', 'start': 618.206, 'duration': 4.163}, {'end': 624.871, 'text': 'So the first phase is capture.', 'start': 623.23, 'duration': 1.641}, {'end': 627.253, 'text': 'It is also known as an extract phase.', 'start': 625.412, 'duration': 1.841}, {'end': 633.559, 'text': 'So in this phase, it basically picks the source data or metadata which can be present in any format.', 'start': 628.134, 'duration': 5.425}, {'end': 636.297, 'text': 'So the next process is scrub.', 'start': 634.875, 'duration': 1.422}, {'end': 639.964, 'text': 'So scrub basically identify errors in your original data.', 'start': 636.738, 'duration': 3.226}, {'end': 646.795, 'text': 'For checking these errors and inconsistencies, it uses some artificial intelligence techniques to verify the quality of the data.', 'start': 640.284, 'duration': 6.511}, {'end': 653.556, 'text': 'So it verifies quality of the data and basically ensures that the quality of the data is met or not.', 'start': 647.912, 'duration': 5.644}, {'end': 655.618, 'text': 'Third, we have transformation.', 'start': 654.197, 'duration': 1.421}, {'end': 661.602, 'text': 'So transformation is another process where your source data is converted to the required format you want.', 'start': 656.118, 'duration': 5.484}, {'end': 666.085, 'text': 'So transformation is modeling or changing your data to meet the requirements.', 'start': 662.443, 'duration': 3.642}, {'end': 669.608, 'text': 'It can be with respect to number of rows and column processing.', 'start': 666.846, 'duration': 2.762}, {'end': 674.091, 'text': 'So if you want to increase the number of rows or columns, you can transform it accordingly.', 'start': 670.208, 'duration': 3.883}, {'end': 677.114, 'text': 'Next final stage is load and index.', 'start': 675.052, 'duration': 2.062}, {'end': 684.043, 'text': 'So in this stage it loads the data and validates number of rows that is processed meets the required number of rows.', 'start': 678.055, 'duration': 5.988}, {'end': 690.891, 'text': 'Once your loading is done indexing helps you track the number of rows or the amount of data you are loading into the warehouse.', 'start': 684.904, 'duration': 5.987}], 'summary': 'Etl process ensures data relevance, accuracy, and quality, facilitating meaningful data-driven decisions. 
it is divided into four phases: capture, scrub, transformation, and load/index.', 'duration': 108.553, 'max_score': 582.338, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw582338.jpg'}, {'end': 761.731, 'src': 'embed', 'start': 730.69, 'weight': 5, 'content': [{'end': 732.611, 'text': 'But these terms are extremely important for you.', 'start': 730.69, 'duration': 1.921}, {'end': 739.656, 'text': 'So once you understand what a data warehouse is, what a BI system is, all this will become secondary and it will naturally come to you.', 'start': 733.272, 'duration': 6.384}, {'end': 748.922, 'text': 'So data warehouse is a technique where we pull the data or assemble the data from various sources and combine them in order to answer the question which business users want?', 'start': 740.336, 'duration': 8.586}, {'end': 753.285, 'text': 'So stop me if you have any doubts in these terms or anything related to data warehouse.', 'start': 749.542, 'duration': 3.743}, {'end': 761.731, 'text': 'Alright, so Bebek here asked me a question, is data warehouse different from a database? Well, the answer to your question is yes, no both.', 'start': 754.225, 'duration': 7.506}], 'summary': 'Data warehouse is a technique to combine data from various sources to answer business questions.', 'duration': 31.041, 'max_score': 730.69, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw730690.jpg'}, {'end': 795.708, 'src': 'embed', 'start': 771.279, 'weight': 6, 'content': [{'end': 779.085, 'text': 'because data warehouse is a structure where analytics related queries can be fired and you can get faster query responses if you compare it with a database.', 'start': 771.279, 'duration': 7.806}, {'end': 780.947, 'text': 'So I hope you got an answer Bebek.', 'start': 779.626, 'duration': 1.321}, {'end': 783.883, 'text': "Okay, so he's saying he's clear.", 'start': 782.523, 'duration': 1.36}, {'end': 789.445, 'text': "Alright, so now let's just quickly go ahead and understand the architecture of data warehouse.", 'start': 783.903, 'duration': 5.542}, {'end': 792.847, 'text': 'So if you look at the diagram, it is read from left to right.', 'start': 790.146, 'duration': 2.701}, {'end': 795.708, 'text': 'On the left hand side, you have your operational data.', 'start': 793.587, 'duration': 2.121}], 'summary': 'Data warehouse enables faster analytics queries compared to databases.', 'duration': 24.429, 'max_score': 771.279, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw771279.jpg'}, {'end': 863.167, 'src': 'embed', 'start': 835.328, 'weight': 7, 'content': [{'end': 838.031, 'text': 'From data warehouse, data goes into a cube sort of structure.', 'start': 835.328, 'duration': 2.703}, {'end': 840.813, 'text': 'So cube is categorized as OLAP structure.', 'start': 838.631, 'duration': 2.182}, {'end': 844.196, 'text': 'So OLAP here refers to online analytical processing.', 'start': 841.233, 'duration': 2.963}, {'end': 848.039, 'text': 'This comes in the SSAS, where all your data analysis is done.', 'start': 844.776, 'duration': 3.263}, {'end': 851.522, 'text': "Don't worry guys, we'll be talking about SSAS in the next session.", 'start': 848.599, 'duration': 2.923}, {'end': 857.945, 'text': 'Now, once you have this OLAP structure, you just have to build in your reports on it and export it to business users,', 'start': 852.142, 'duration': 5.803}, {'end': 859.986, 
'text': 'and then they can go ahead and make the analysis.', 'start': 857.945, 'duration': 2.041}, {'end': 863.167, 'text': 'So this is a typical data warehouse architecture.', 'start': 860.586, 'duration': 2.581}], 'summary': 'Data warehouse uses olap structure for ssas data analysis and reporting.', 'duration': 27.839, 'max_score': 835.328, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw835328.jpg'}], 'start': 560.956, 'title': 'Etl and data warehouse basics', 'summary': 'Explains the etl process, responsible for pulling data out of the source and placing it into a data warehouse to ensure relevance, accuracy, and high quality. it also discusses the basics of data warehouse, including its purpose, architecture, and its comparison with a database, emphasizing the importance of understanding these concepts for effective analysis and reporting.', 'chapters': [{'end': 690.891, 'start': 560.956, 'title': 'Etl process: extract, transform, load', 'summary': 'Explains the etl process, responsible for pulling data out of the source and placing it into a data warehouse to ensure relevance, accuracy, and high quality, thus enabling organizations to make meaningful data-driven decisions. it is divided into four phases: capture, scrub, transformation, and load and index.', 'duration': 129.935, 'highlights': ['The ETL process ensures that the data stored in the warehouse is relevant, accurate, high quality, and easy to access so that the warehouse is used efficiently and effectively by the business users, enabling organizations to make meaningful data-driven decisions by interpreting and transforming large amounts of structured and unstructured data.', 'The ETL process is divided into four phases: capture (extract phase), scrub (identify errors and inconsistencies), transformation (convert source data to the required format), and load and index (load the data, validate the number of rows processed, and track the amount of data loaded into the warehouse).', 'Scrub phase uses artificial intelligence techniques to verify the quality of the data, ensuring that the quality of the data is met or not, by checking for errors and inconsistencies in the original data.', 'Transformation phase involves modeling or changing the data to meet the requirements, including altering the number of rows or columns to meet specific processing needs.', 'Load and index stage loads the data and validates the number of rows processed meets the required number, while indexing helps track the amount of data loaded into the warehouse.']}, {'end': 873.372, 'start': 691.772, 'title': 'Data warehouse basics', 'summary': 'Discusses the basics of data warehouse, including its purpose, architecture, and its comparison with a database, emphasizing the importance of understanding these concepts for effective analysis and reporting.', 'duration': 181.6, 'highlights': ['Data warehouse is a single complete and consistent store of data formulated by combining data from various sources to answer BI questions. Data warehouse is a single complete and consistent store of data formulated by combining data from various sources to answer BI questions.', 'Comparison between data warehouse and a database, highlighting the structure and faster query responses of a data warehouse for analytics. 
Data warehouse is also a database in terms of physical representation, but it differs in its structure and provides faster query responses for analytics.', 'Explanation of the architecture, including ETL process, OLAP structure, and SSAS for data analysis and reporting. The architecture involves the ETL process for data extraction, transformation, and loading, followed by an OLAP structure for data analysis in SSAS and reporting to business users.']}], 'duration': 312.416, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw560956.jpg', 'highlights': ['The ETL process ensures relevant, accurate, high-quality data in the warehouse for efficient use by business users.', 'The ETL process is divided into four phases: capture, scrub, transformation, and load and index.', 'Scrub phase uses AI techniques to verify data quality by checking for errors and inconsistencies.', 'Transformation phase involves modeling or changing data to meet specific processing needs.', 'Load and index stage validates the number of rows processed and tracks the amount of data loaded into the warehouse.', 'Data warehouse combines data from various sources to answer BI questions.', 'Data warehouse provides faster query responses for analytics compared to a database.', 'Architecture involves ETL process, OLAP structure, and SSAS for data analysis and reporting.']}, {'end': 1719.259, 'segs': [{'end': 942.244, 'src': 'embed', 'start': 915.68, 'weight': 0, 'content': [{'end': 920.986, 'text': 'The first is free trial evaluation, second is developer edition, and third is express edition.', 'start': 915.68, 'duration': 5.306}, {'end': 925.01, 'text': 'So the free trial evaluation gives a free trial only for 180 days.', 'start': 921.646, 'duration': 3.364}, {'end': 927.013, 'text': 'Next is your developer edition.', 'start': 925.791, 'duration': 1.222}, {'end': 932.118, 'text': 'Now this edition deals with all your development and test database in a non-production environment.', 'start': 927.473, 'duration': 4.645}, {'end': 934.42, 'text': "Thirdly, it's your express edition.", 'start': 932.739, 'duration': 1.681}, {'end': 939.082, 'text': 'It also deals with your development and production but it is for small server applications.', 'start': 934.56, 'duration': 4.522}, {'end': 942.244, 'text': 'So you can download any one of them based on your requirements.', 'start': 939.803, 'duration': 2.441}], 'summary': 'Free trial: 180-day trial, developer: non-production environment, express: small server applications.', 'duration': 26.564, 'max_score': 915.68, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw915680.jpg'}, {'end': 1022.589, 'src': 'embed', 'start': 996.584, 'weight': 1, 'content': [{'end': 1001.467, 'text': 'You just have to click on it and again the installation process will be the same.', 'start': 996.584, 'duration': 4.883}, {'end': 1003.368, 'text': 'So here this is the setup.', 'start': 1002.287, 'duration': 1.081}, {'end': 1005.309, 'text': 'You can go on and install it.', 'start': 1003.808, 'duration': 1.501}, {'end': 1009.032, 'text': 'Now you must be wondering why I have downloaded two different tools for it.', 'start': 1006.05, 'duration': 2.982}, {'end': 1014.082, 'text': 'So SQL Server 2014 is the actual server RDBMS.', 'start': 1009.699, 'duration': 4.383}, {'end': 1018.606, 'text': 'So whenever you install it you can go and create the database or create table into it.', 'start': 1014.523, 'duration': 
4.083}, {'end': 1022.589, 'text': 'Next we have downloaded the data tools which is used to create the MSPA project.', 'start': 1018.986, 'duration': 3.603}], 'summary': 'Downloaded sql server 2014 for rdbms and data tools for mspa project creation.', 'duration': 26.005, 'max_score': 996.584, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw996584.jpg'}, {'end': 1069.377, 'src': 'embed', 'start': 1040.578, 'weight': 4, 'content': [{'end': 1043.119, 'text': 'Now code here not refers to any programming language.', 'start': 1040.578, 'duration': 2.541}, {'end': 1044.601, 'text': "It's the development you do.", 'start': 1043.56, 'duration': 1.041}, {'end': 1047.321, 'text': 'So the development is done inside the package.', 'start': 1045.141, 'duration': 2.18}, {'end': 1052.766, 'text': 'So SSIS is essentially for ETL and the package will go ahead and do the ETL process.', 'start': 1047.923, 'duration': 4.843}, {'end': 1054.948, 'text': 'Now a package will have some connections.', 'start': 1053.346, 'duration': 1.602}, {'end': 1058.69, 'text': 'Now these connections will help you connect to the various data sources.', 'start': 1055.388, 'duration': 3.302}, {'end': 1062.433, 'text': 'It will have control flow elements and data flow elements as well.', 'start': 1059.571, 'duration': 2.862}, {'end': 1065.795, 'text': 'So these two components combine together to form a package.', 'start': 1062.953, 'duration': 2.842}, {'end': 1069.377, 'text': 'Now here your control flow elements handles your workflow.', 'start': 1066.535, 'duration': 2.842}], 'summary': 'Ssis is used for etl, involving packages with connections, control flow, and data flow elements.', 'duration': 28.799, 'max_score': 1040.578, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw1040578.jpg'}, {'end': 1125.508, 'src': 'embed', 'start': 1101.6, 'weight': 2, 'content': [{'end': 1108.284, 'text': 'Now remember I am taking my file input as CSV which is a comma separated file but you can take it from any other source as well.', 'start': 1101.6, 'duration': 6.684}, {'end': 1112.827, 'text': 'After that I will perform some transformations to it and then load it to database.', 'start': 1109.044, 'duration': 3.783}, {'end': 1117.429, 'text': 'So all these set of operations are performed inside a SSIS package.', 'start': 1113.867, 'duration': 3.562}, {'end': 1121.752, 'text': 'Now to import your data from a CSV file, first you have to create a CSV file.', 'start': 1117.93, 'duration': 3.822}, {'end': 1125.508, 'text': 'So let us go to Notepad and create a comma separated file.', 'start': 1122.406, 'duration': 3.102}], 'summary': 'Instructions for importing data from csv and performing transformations in ssis.', 'duration': 23.908, 'max_score': 1101.6, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw1101600.jpg'}, {'end': 1232.674, 'src': 'heatmap', 'start': 1193.864, 'weight': 0.897, 'content': [{'end': 1194.364, 'text': 'This is done.', 'start': 1193.864, 'duration': 0.5}, {'end': 1203.726, 'text': 'So let me open my SQL server.', 'start': 1202.246, 'duration': 1.48}, {'end': 1209.808, 'text': 'So you can find in all programs, micro SQL server 2014, and here it is.', 'start': 1204.447, 'duration': 5.361}, {'end': 1215.564, 'text': "So first of all I'll create a new project.", 'start': 1213.683, 'duration': 1.881}, {'end': 1221.127, 'text': 'So here as you can see there are two 
methods to achieve this.', 'start': 1218.746, 'duration': 2.381}, {'end': 1227.171, 'text': 'So the first is a scratch template where you can perform your ETL and the second is using a wizard.', 'start': 1222.068, 'duration': 5.103}, {'end': 1229.472, 'text': "So we'll go to integration services project.", 'start': 1227.671, 'duration': 1.801}, {'end': 1232.674, 'text': "So I'll change the name to student information.", 'start': 1229.953, 'duration': 2.721}], 'summary': "Setting up sql server 2014, creating a new project with two methods for etl, and naming the project 'student information'.", 'duration': 38.81, 'max_score': 1193.864, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw1193864.jpg'}, {'end': 1451.046, 'src': 'heatmap', 'start': 1408.253, 'weight': 0.729, 'content': [{'end': 1412.356, 'text': 'Now here I will go to my student ID and create this as primary key.', 'start': 1408.253, 'duration': 4.103}, {'end': 1415.278, 'text': 'Now I hope you guys are aware with the primary key concept.', 'start': 1412.996, 'duration': 2.282}, {'end': 1420.356, 'text': 'So once my table is created, now let me go back to my Visual Studio.', 'start': 1416.854, 'duration': 3.502}, {'end': 1423.558, 'text': "I've already discussed about control flow and data flow.", 'start': 1421.197, 'duration': 2.361}, {'end': 1427.401, 'text': 'Now here in my SSIS toolbox, there is my data flow task.', 'start': 1424.179, 'duration': 3.222}, {'end': 1431.203, 'text': "So I'll just drag and drop this data flow task and put a nice name to it.", 'start': 1427.481, 'duration': 3.722}, {'end': 1433.945, 'text': "So let's say load CSV file.", 'start': 1432.003, 'duration': 1.942}, {'end': 1440.701, 'text': 'So once this is done, when I double click on this component, it will automatically go to data flow.', 'start': 1435.239, 'duration': 5.462}, {'end': 1442.782, 'text': "So again, I'll show you.", 'start': 1441.342, 'duration': 1.44}, {'end': 1451.046, 'text': "If you can see, I'm here in control flow tab, where I drag and drop this data flow element, and when you double click on this,", 'start': 1443.683, 'duration': 7.363}], 'summary': 'Creating a primary key in ssis and navigating through visual studio components.', 'duration': 42.793, 'max_score': 1408.253, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw1408253.jpg'}], 'start': 873.912, 'title': 'Ssis installation and package creation', 'summary': 'Covers obtaining and installing tools for ssis, including sql server options, and creating ssis packages for etl, including importing data, data transformation, and loading, with a demonstration of loading 500 rows of data into a database.', 'chapters': [{'end': 1018.606, 'start': 873.912, 'title': 'Tools for ssis installation', 'summary': 'Explains the process of obtaining and installing the necessary tools for ssis, including downloading sql server with options for free trial, developer edition, and express edition, and acquiring sql server data tools 17.1.', 'duration': 144.694, 'highlights': ['The chapter details the process of downloading and installing SQL Server, with options including free trial evaluation for 180 days, developer edition for development and test databases, and express edition for small server applications.', 'The chapter also explains the acquisition and installation of SQL Server data tools 17.1, emphasizing the importance of these tools for working with SQL Server 2014 as the 
actual server RDBMS.']}, {'end': 1719.259, 'start': 1018.986, 'title': 'Ssis package creation and etl process', 'summary': 'Explains the process of creating ssis packages for etl, including defining package structure, importing data from csv files, creating a database, and performing data transformation and loading, with a demonstration of loading 500 rows of data into a database.', 'duration': 700.273, 'highlights': ['SSIS package is fundamental for ETL, consisting of control flow and data flow elements, used for workflow handling and data transformations. The SSIS package is essential for ETL processes, consisting of control flow elements for workflow handling and data flow elements for data transformations.', 'Demonstration of importing 500 rows of data from a CSV file into a database using SSIS package. The demonstration involves importing 500 rows of data from a CSV file into a database using SSIS package, showcasing the practical application of ETL processes.', 'Process of creating a database and table, mapping data from a CSV file to the table, and successfully loading 500 rows of data into the database using SSIS package. The process involves creating a database and table, mapping data from a CSV file to the table, and successfully loading 500 rows of data into the database using SSIS package, demonstrating the end-to-end ETL process.']}], 'duration': 845.347, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw873912.jpg', 'highlights': ['The chapter details the process of downloading and installing SQL Server, with options including free trial evaluation for 180 days, developer edition for development and test databases, and express edition for small server applications.', 'The chapter also explains the acquisition and installation of SQL Server data tools 17.1, emphasizing the importance of these tools for working with SQL Server 2014 as the actual server RDBMS.', 'Process of creating a database and table, mapping data from a CSV file to the table, and successfully loading 500 rows of data into the database using SSIS package. The process involves creating a database and table, mapping data from a CSV file to the table, and successfully loading 500 rows of data into the database using SSIS package, demonstrating the end-to-end ETL process.', 'Demonstration of importing 500 rows of data from a CSV file into a database using SSIS package. The demonstration involves importing 500 rows of data from a CSV file into a database using SSIS package, showcasing the practical application of ETL processes.', 'SSIS package is fundamental for ETL, consisting of control flow and data flow elements, used for workflow handling and data transformations. 
The SSIS package is essential for ETL processes, consisting of control flow elements for workflow handling and data flow elements for data transformations.']}, {'end': 1978.299, 'segs': [{'end': 1800.271, 'src': 'embed', 'start': 1770.448, 'weight': 0, 'content': [{'end': 1772.849, 'text': 'Next I have other destination where I have loaded the data.', 'start': 1770.448, 'duration': 2.401}, {'end': 1775.05, 'text': 'And here comes the transformation part.', 'start': 1773.069, 'duration': 1.981}, {'end': 1778.772, 'text': 'Also you can see this common section where it has again transformation rules.', 'start': 1775.23, 'duration': 3.542}, {'end': 1780.953, 'text': "So I'll take a derived column here.", 'start': 1779.212, 'duration': 1.741}, {'end': 1786.957, 'text': 'So here my derived column is nothing but it will create an extra column in the existing column data.', 'start': 1782.514, 'duration': 4.443}, {'end': 1790.563, 'text': 'So here, first of all I have to make a connection.', 'start': 1787.901, 'duration': 2.662}, {'end': 1793.285, 'text': 'It says it makes a connection.', 'start': 1790.983, 'duration': 2.302}, {'end': 1800.271, 'text': "Now let's say closed.", 'start': 1799.25, 'duration': 1.021}], 'summary': 'Data loaded to another destination, transformation applied, derived column created.', 'duration': 29.823, 'max_score': 1770.448, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw1770448.jpg'}, {'end': 1852.732, 'src': 'embed', 'start': 1821.725, 'weight': 2, 'content': [{'end': 1823.806, 'text': 'Either add as a new column or replace.', 'start': 1821.725, 'duration': 2.081}, {'end': 1824.966, 'text': "So I'll add a new column.", 'start': 1823.906, 'duration': 1.06}, {'end': 1828.148, 'text': 'Let me give a nice name to it.', 'start': 1826.707, 'duration': 1.441}, {'end': 1830.849, 'text': 'Say upper student name.', 'start': 1828.368, 'duration': 2.481}, {'end': 1836.897, 'text': 'Now if you see I have different functions available over here.', 'start': 1834.195, 'duration': 2.702}, {'end': 1841.922, 'text': 'So it has mathematical functions, string functions, date, null functions and many more.', 'start': 1837.378, 'duration': 4.544}, {'end': 1848.848, 'text': "So I'll directly go to the string functions and see if there is any method for uppercase.", 'start': 1842.462, 'duration': 6.386}, {'end': 1852.732, 'text': 'So here you find an upper function so let me just drag and drop to here.', 'start': 1849.609, 'duration': 3.123}], 'summary': "Adding a new column 'upper student name' using string functions, specifically the 'upper' function.", 'duration': 31.007, 'max_score': 1821.725, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw1821725.jpg'}, {'end': 1933.132, 'src': 'embed', 'start': 1903.38, 'weight': 1, 'content': [{'end': 1907.303, 'text': 'So my extraction, transformation, and loading has been done successfully.', 'start': 1903.38, 'duration': 3.923}, {'end': 1913.568, 'text': "So let us go back in my management studio, and let's see what here it shows.", 'start': 1908.304, 'duration': 5.264}, {'end': 1918.593, 'text': "Select star on this, and I'll just execute this.", 'start': 1915.23, 'duration': 3.363}, {'end': 1923.357, 'text': 'So if you notice here, my studio name is in capital case.', 'start': 1919.413, 'duration': 3.944}, {'end': 1925.128, 'text': 'So there is it.', 'start': 1924.448, 'duration': 0.68}, {'end': 1928.57, 'text': 'So my extraction, 
transformation and loading has been done.', 'start': 1925.869, 'duration': 2.701}, {'end': 1933.132, 'text': "So I hope you guys are clear with all the three processes that I've just discussed.", 'start': 1929.851, 'duration': 3.281}], 'summary': 'Extraction, transformation, and loading completed successfully in management studio with capital case studio name.', 'duration': 29.752, 'max_score': 1903.38, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw1903380.jpg'}], 'start': 1719.779, 'title': 'Sql table transformation and etl process', 'summary': 'Covers the transformation of a sql table, including deletion, setting primary key, and managing data sources, as well as an introduction to the etl process using ssis, involving creating derived columns, applying string functions, and executing successful extraction, transformation, and loading processes.', 'chapters': [{'end': 1770.047, 'start': 1719.779, 'title': 'Sql table transformation process', 'summary': 'Explains the process of transforming a table in a sql database, including deleting the table, setting primary key, and managing data sources.', 'duration': 50.268, 'highlights': ['The table is deleted to avoid primary key violations. The primary key violation is avoided by deleting the table.', 'The student names are transformed from lower case to upper case. Transformation of student names from lower case to upper case is demonstrated.', 'The extraction source from a flat file is mentioned as part of the process. The process includes mentioning the extraction source from a flat file.']}, {'end': 1978.299, 'start': 1770.448, 'title': 'Etl process and ssis overview', 'summary': 'Introduces the etl process, showcasing the transformation of data using ssis, including creating a derived column, applying string functions such as uppercase transformation, and executing successful extraction, transformation, and loading processes.', 'duration': 207.851, 'highlights': ['The chapter introduces the ETL process, showcasing the transformation of data using SSIS, including creating a derived column, applying string functions such as uppercase transformation, and executing successful extraction, transformation, and loading processes.', "The speaker demonstrates the creation of a derived column 'upper student name' using the SSIS tool, which involves applying the 'upper' string function to transform the 'student name' to uppercase, facilitating data manipulation within the ETL process.", 'The speaker emphasizes successful execution of the extraction, transformation, and loading process using SSIS, ensuring that the data is correctly transformed and loaded into the destination, ultimately leading to the display of the transformed data in the management studio.']}], 'duration': 258.52, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/0ikNnenDyNw/pics/0ikNnenDyNw1719779.jpg', 'highlights': ['The chapter introduces the ETL process, showcasing the transformation of data using SSIS, including creating a derived column, applying string functions such as uppercase transformation, and executing successful extraction, transformation, and loading processes.', 'The speaker emphasizes successful execution of the extraction, transformation, and loading process using SSIS, ensuring that the data is correctly transformed and loaded into the destination, ultimately leading to the display of the transformed data in the management studio.', "The speaker demonstrates the creation of a 
derived column 'upper student name' using the SSIS tool, which involves applying the 'upper' string function to transform the 'student name' to uppercase, facilitating data manipulation within the ETL process."]}], 'highlights': ['SSIS is cost-effective compared to other tools, making it a preferred choice for data integration.', 'SSIS has the capability to merge and integrate data from diverse sources, such as flat files, Excel, SAP, and Oracle.', 'The ETL process ensures relevant, accurate, high-quality data in the warehouse for efficient use by business users.', 'The ETL process is divided into four phases: capture, scrub, transformation, and load and index.', 'Process of creating a database and table, mapping data from a CSV file to the table, and successfully loading 500 rows of data into the database using SSIS package.', 'The chapter introduces the ETL process, showcasing the transformation of data using SSIS, including creating a derived column, applying string functions such as uppercase transformation, and executing successful extraction, transformation, and loading processes.']}
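The walkthrough above stays inside the SSIS designer and Management Studio UI; the short T-SQL sketches below mirror a few of its ideas for readers who want to try the same steps directly against SQL Server. First, the opening example — sales data in Salesforce/SAP, marketing in Oracle, manufacturing in DB2, all needing to be analysed together — is what the integration layer solves by landing everything in one place. A toy sketch of combining two already-staged sources into a unified view; the table and column names are invented for illustration and are not from the video:

```sql
-- Toy illustration of data integration: two staging tables, each loaded from a
-- different source system, combined into one unified revenue view.
-- Table and column names are hypothetical.
SELECT Department, Revenue, 'CRM' AS SourceSystem FROM dbo.stg_crm_revenue
UNION ALL
SELECT Department, Revenue, 'ERP' AS SourceSystem FROM dbo.stg_erp_revenue;
```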
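The "scrub" phase of ETL and the earlier mention of data profiling (checking sample data for inconsistencies, errors or variations) can be approximated with plain queries before data moves on to the warehouse. A minimal sketch, assuming a hypothetical staging table dbo.stg_students with StudentID and StudentName columns:

```sql
-- Minimal profiling / scrub checks on a hypothetical staging table.
-- Counts total rows, NULL names, and rows in excess of distinct student IDs (duplicates).
SELECT
    COUNT(*)                                              AS TotalRows,
    SUM(CASE WHEN StudentName IS NULL THEN 1 ELSE 0 END)  AS NullNames,
    COUNT(*) - COUNT(DISTINCT StudentID)                  AS DuplicateIds
FROM dbo.stg_students;
```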
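The demo creates the destination database and a table with the student ID as primary key through the Management Studio UI. A rough T-SQL equivalent is shown below; the database, table and column names are assumptions based on the transcript (student information, student ID, student name):

```sql
-- Rough T-SQL equivalent of the demo's database and table creation.
-- Names (StudentDB, dbo.Students, column list) are assumed, not taken from the video.
CREATE DATABASE StudentDB;
GO

USE StudentDB;
GO

CREATE TABLE dbo.Students (
    StudentID   INT          NOT NULL PRIMARY KEY,  -- primary key, as in the demo
    StudentName VARCHAR(100) NOT NULL
);
GO
```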
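In the video, the 500-row CSV is loaded with an SSIS Flat File Source wired to a database destination inside a Data Flow task. Outside of SSIS, a comparable load can be sketched with BULK INSERT; the file path, header handling and delimiters below are assumptions for illustration only:

```sql
-- Sketch of loading a comma-separated file into the Students table without SSIS.
-- The path, header row and terminators are assumed values.
BULK INSERT dbo.Students
FROM 'C:\data\students.csv'
WITH (
    FIRSTROW = 2,            -- skip the header row
    FIELDTERMINATOR = ',',   -- comma-separated values
    ROWTERMINATOR = '\n'
);

-- Quick check, mirroring the demo's verification that 500 rows were loaded.
SELECT COUNT(*) AS RowsLoaded FROM dbo.Students;
```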
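Finally, the derived-column step adds an extra "upper student name" column by applying the UPPER string function in the SSIS expression editor. The same transformation expressed in plain T-SQL looks roughly like this (column names again assumed from the transcript):

```sql
-- SSIS derived-column expression used in the demo (shown here only as a comment):
--   UPPER([StudentName])
-- A comparable transformation in T-SQL: add the extra column, then populate it.
ALTER TABLE dbo.Students
    ADD UpperStudentName VARCHAR(100);
GO

UPDATE dbo.Students
SET UpperStudentName = UPPER(StudentName);
GO

-- Verify the result, as done in Management Studio at the end of the demo.
SELECT * FROM dbo.Students;
```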