title
Splunk in 60 Minutes | Splunk Tutorial For Beginners | Splunk Training | Splunk Tutorial | Edureka

description
***** Splunk Training: https://www.edureka.co/splunk-certification-training ***** This Edureka Live session on Splunk will help you understand the fundamentals of Splunk with a demo on Log Collection & Analysis. Below are the topics that will be discussed in this session: 1. Why Splunk? – Problems With Log Data 2. What Is Splunk? – Ultimate Solution For Log Processing 3. How Does It Work? – Splunk Components 4. Hands-On: Log Collection & Analysis For more information, please write back to us at sales@edureka.co or call us at IND: 9606058406 / US: 18338555775 (toll free).

detail
{'title': 'Splunk in 60 Minutes | Splunk Tutorial For Beginners | Splunk Training | Splunk Tutorial | Edureka', 'heatmap': [{'end': 1266.13, 'start': 1133.636, 'weight': 0.902}], 'summary': "This splunk tutorial covers the importance of log analysis, real-time log forwarding, and significant cost savings achieved by companies like new york air brake, us's railroad organization, domino's pizza, and ing bank. it also includes demonstrations on managing search cluster components, installing and searching in splunk, setting up data forwarding and monitoring, and universal forwarder data flow with a focus on data filtering and analysis.", 'chapters': [{'end': 145.165, 'segs': [{'end': 109.814, 'src': 'embed', 'start': 40.935, 'weight': 0, 'content': [{'end': 43.637, 'text': 'And these will be the topics that I will cover.', 'start': 40.935, 'duration': 2.702}, {'end': 50.722, 'text': 'So first I will talk about the problems with log data and tell you why is there a need for a tool like Splunk right?', 'start': 44.057, 'duration': 6.665}, {'end': 55.806, 'text': "And after that I'll tell you what exactly Splunk is and how it is the ultimate solution for log processing.", 'start': 50.742, 'duration': 5.064}, {'end': 61.47, 'text': "And then I'll talk about a few of the Splunk components that are involved and I'll tell you how Splunk works.", 'start': 57.047, 'duration': 4.423}, {'end': 62.531, 'text': 'And finally,', 'start': 62.05, 'duration': 0.481}, {'end': 71.926, 'text': 'I will give you a demonstration of collecting logs from a remote instance and then putting them on to your Splunk instance and then performing analysis and visualization on that data.', 'start': 62.531, 'duration': 9.395}, {'end': 80.432, 'text': "Right?. 
So I hope that has set the agenda for today and, without wasting any time, I'm going to go to the first topic.", 'start': 72.847, 'duration': 7.585}, {'end': 82.814, 'text': 'and that is why is it?', 'start': 80.432, 'duration': 2.382}, {'end': 83.594, 'text': 'why is there a need?', 'start': 82.814, 'duration': 0.78}, {'end': 87.096, 'text': 'for you know, why is there a need to explore logs?', 'start': 83.594, 'duration': 3.502}, {'end': 96.503, 'text': "There's a famous saying that logs are the go to archives for gaining company wide operational intelligence.", 'start': 91.059, 'duration': 5.444}, {'end': 100.266, 'text': 'Now, does anybody disagree with this?', 'start': 97.884, 'duration': 2.382}, {'end': 109.814, 'text': 'You can please put in your comment box, and if there are people who agree with this, then I would also request you to tell us why you think so.', 'start': 101.507, 'duration': 8.307}], 'summary': 'The transcript covers the importance of log data, the need for splunk, its components, and a demonstration of log processing and visualization.', 'duration': 68.879, 'max_score': 40.935, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s40935.jpg'}], 'start': 4.778, 'title': 'Understanding splunk for log analysis', 'summary': 'Covers the importance of log analysis, the key components of splunk, and includes a demonstration on collecting and analyzing log data, highlighting the relevance of log analysis and splunk for businesses, particularly those operating online.', 'chapters': [{'end': 145.165, 'start': 4.778, 'title': 'Understanding splunk for log analysis', 'summary': 'Covers the importance of log analysis, the key components of splunk, and includes a demonstration on collecting and analyzing log data, highlighting the relevance of log analysis and splunk for businesses, particularly those operating online.', 'duration': 140.387, 'highlights': ['Logs are the go-to archives for gaining 
company wide operational intelligence, essential for businesses operating online like Edureka. Highlighting the importance of log analysis for gaining operational intelligence and its relevance for online businesses like Edureka.', 'Explanation of the problems with log data and the need for a tool like Splunk. Addressing the issues with log data and emphasizing the necessity for a tool like Splunk to tackle these problems effectively.', 'Demonstration of collecting logs from a remote instance, putting them onto the Splunk instance, and performing analysis and visualization on the data. Illustrating the practical application of Splunk by demonstrating the process of collecting, analyzing, and visualizing log data from a remote instance.']}], 'duration': 140.387, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s4778.jpg', 'highlights': ['Logs are the go-to archives for gaining company wide operational intelligence, essential for businesses operating online like Edureka.', 'Explanation of the problems with log data and the need for a tool like Splunk.', 'Demonstration of collecting logs from a remote instance, putting them onto the Splunk instance, and performing analysis and visualization on the data.']}, {'end': 911.46, 'segs': [{'end': 246.8, 'src': 'embed', 'start': 217.236, 'weight': 0, 'content': [{'end': 222.317, 'text': "Now, what will be the operations and transactions that I'm talking about? 
They are these.", 'start': 217.236, 'duration': 5.081}, {'end': 232.938, 'text': 'In case of your server, you will have details about what is your background, of your customers, what is the IP address of those people,', 'start': 223.536, 'duration': 9.402}, {'end': 236.018, 'text': 'which is the location or the geography from where they are accessing it?', 'start': 232.938, 'duration': 3.08}, {'end': 237.599, 'text': 'You have those details.', 'start': 236.758, 'duration': 0.841}, {'end': 241.739, 'text': 'And then you will have details with respect to there being any security threats.', 'start': 238.039, 'duration': 3.7}, {'end': 246.8, 'text': 'Are there any vulnerabilities to your servers, to your network??', 'start': 242.48, 'duration': 4.32}], 'summary': 'Operations and transactions include customer background, ip addresses, location, and security threats.', 'duration': 29.564, 'max_score': 217.236, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s217236.jpg'}, {'end': 441.714, 'src': 'embed', 'start': 410.875, 'weight': 1, 'content': [{'end': 411.235, 'text': 'Can you?', 'start': 410.875, 'duration': 0.36}, {'end': 412.996, 'text': 'Definitely not right?', 'start': 411.915, 'duration': 1.081}, {'end': 414.056, 'text': "It's an impossible task.", 'start': 413.076, 'duration': 0.98}, {'end': 417.498, 'text': 'And such logs, they get generated massively.', 'start': 414.697, 'duration': 2.801}, {'end': 422.201, 'text': 'Almost thousands of such lines of codes get generated every single minute in fact.', 'start': 417.958, 'duration': 4.243}, {'end': 426.263, 'text': 'and what happens is every single transaction.', 'start': 423.021, 'duration': 3.242}, {'end': 432.968, 'text': "let's take, for example, the web server of any e-commerce company, or you can even take example of edureka here.", 'start': 426.263, 'duration': 6.705}, {'end': 439.032, 'text': "so every single user that is trying to access edureka's 
website or something like amazon.com or flipkart.com,", 'start': 432.968, 'duration': 6.064}, {'end': 441.714, 'text': 'their ip address will be recorded along with that.', 'start': 439.032, 'duration': 2.682}], 'summary': 'Massive generation of logs, thousands of lines per minute, recording user ip addresses.', 'duration': 30.839, 'max_score': 410.875, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s410875.jpg'}, {'end': 527.698, 'src': 'embed', 'start': 498.441, 'weight': 2, 'content': [{'end': 500.342, 'text': 'Splunk does real-time log forwarding.', 'start': 498.441, 'duration': 1.901}, {'end': 502.123, 'text': 'Now, when we say real-time log forwarding,', 'start': 500.582, 'duration': 1.541}, {'end': 509.526, 'text': 'it means that it can collect logs from one particular instance or one particular server and forward those logs to a remote instance.', 'start': 502.123, 'duration': 7.403}, {'end': 513.234, 'text': "okay, so you can see that there's a system here.", 'start': 510.413, 'duration': 2.821}, {'end': 518.794, 'text': 'right, you can assume this as a server of probably a web server or a router, and from here whatever logs are collected,', 'start': 513.234, 'duration': 5.56}, {'end': 522.155, 'text': 'they can be sent to the remote location.', 'start': 518.794, 'duration': 3.361}, {'end': 527.698, 'text': "they can be sent to the search at, where the end users will be actually looking at the data and they'll be able to get insights.", 'start': 522.155, 'duration': 5.543}], 'summary': 'Splunk enables real-time log forwarding from one server to a remote instance, providing end users with insights.', 'duration': 29.257, 'max_score': 498.441, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s498441.jpg'}, {'end': 584.854, 'src': 'embed', 'start': 561.17, 'weight': 3, 'content': [{'end': 568.401, 'text': 'But here It can be installed on any system 
of yours and it can monitor any application based on whatever system.', 'start': 561.17, 'duration': 7.231}, {'end': 571.361, 'text': 'logs are being generated in real time and you can perform analysis on that.', 'start': 568.401, 'duration': 2.96}, {'end': 578.623, 'text': 'And similarly, you can install Splunk on any servers and you can perform monitoring and understand what is the IP traffic.', 'start': 572.242, 'duration': 6.381}, {'end': 580.544, 'text': 'how many people are there on your website?', 'start': 578.623, 'duration': 1.921}, {'end': 583.373, 'text': 'what are they trying to perform??', 'start': 581.812, 'duration': 1.561}, {'end': 584.854, 'text': 'What actions are they trying to perform??', 'start': 583.673, 'duration': 1.181}], 'summary': 'Splunk can monitor any application, generate real-time logs, and analyze ip traffic to understand website visitors and their actions.', 'duration': 23.684, 'max_score': 561.17, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s561170.jpg'}, {'end': 675.872, 'src': 'embed', 'start': 638.676, 'weight': 4, 'content': [{'end': 639.576, 'text': "right. 
so that's one more thing.", 'start': 638.676, 'duration': 0.9}, {'end': 649.099, 'text': 'So another application of Splunk is that it can give you real time alerts and notifications.', 'start': 641.892, 'duration': 7.207}, {'end': 659.569, 'text': "So these alerts or notifications would come in handy when there's an actual security threat or probably when something wrong is about to happen or something strange is happening to your servers.", 'start': 649.419, 'duration': 10.15}, {'end': 666.698, 'text': 'If someone is accessing your network from from a very unreliable source,', 'start': 660.249, 'duration': 6.449}, {'end': 675.872, 'text': "then immediately you can configure Splunk such that it would throw you an alert when it realizes that there's a request coming in from that particular IP range.", 'start': 666.698, 'duration': 9.174}], 'summary': 'Splunk can provide real-time alerts and notifications for security threats and unusual server activity, such as unauthorized network access from unreliable sources.', 'duration': 37.196, 'max_score': 638.676, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s638676.jpg'}, {'end': 787.513, 'src': 'embed', 'start': 764.036, 'weight': 5, 'content': [{'end': 772.579, 'text': 'so you can make use of all these historical data lock store and you can gain a lot of business insights and a lot of advantages.', 'start': 764.036, 'duration': 8.543}, {'end': 779.468, 'text': 'And I might also want to tell you at this point of time that when Splunk stores data, it stores data in a compressed form.', 'start': 773.604, 'duration': 5.864}, {'end': 783.711, 'text': "So it's not like whatever data comes in, it's going to be stored straight away.", 'start': 779.648, 'duration': 4.063}, {'end': 787.513, 'text': "No, it's going to be compressed at one level to one point of time.", 'start': 784.071, 'duration': 3.442}], 'summary': 'Splunk stores historical data in compressed form, 
providing business insights and advantages.', 'duration': 23.477, 'max_score': 764.036, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s764036.jpg'}], 'start': 145.652, 'title': 'Importance of exploring logs for operational intelligence', 'summary': "Emphasizes the valuable information contained in logs, the challenges of deciphering them, and the potential impact on business operations, especially in the era of internet dominance. it also explains how splunk enables real-time log forwarding, syslog analysis, real-time alerts, and historical data storage, benefiting companies like vodafone, domino's, and ing bank by managing big data and optimizing customer offers based on insights from logs.", 'chapters': [{'end': 456.231, 'start': 145.652, 'title': 'Importance of exploring logs for operational intelligence', 'summary': 'Emphasizes the importance of exploring logs for operational intelligence, detailing the valuable information contained in logs, the challenges of deciphering them, and the potential impact on business operations, especially in the era of internet dominance.', 'duration': 310.579, 'highlights': ['Logs contain crucial details about server operations, including customer background, IP addresses, geographical locations, and security threats, providing essential insights for business operations. Details about customer background, IP addresses, geographical locations, and security threats are stored in logs.', 'Logs also encompass critical system performance information such as CPU usage, load, user access, application usage, and system responsiveness, enabling informed decision-making for system optimization. 
Logs store data related to CPU usage, load, user access, application usage, and system responsiveness.', 'Massive generation of logs, with thousands of lines of codes produced every single minute, underlines the challenge of extracting valuable insights from the sheer volume of data, emphasizing the need for efficient log exploration techniques. Thousands of lines of codes are generated every single minute, posing a significant challenge in log analysis.']}, {'end': 911.46, 'start': 456.731, 'title': 'Splunk: real-time log analysis tool', 'summary': "Explains how splunk enables real-time log forwarding, syslog analysis, real-time alerts, and historical data storage, benefiting companies like vodafone, domino's, and ing bank by managing big data and optimizing customer offers based on insights from logs.", 'duration': 454.729, 'highlights': ['Splunk enables real-time log forwarding, allowing collection and forwarding of logs from one server to a remote location, providing human-readable insights for end users. This feature allows real-time monitoring and analysis of logs, aiding in quick decision-making and problem resolution.', 'Splunk provides real-time syslog analysis, allowing monitoring and analysis of logs generated in real time from any application or server, enabling monitoring of IP traffic and customer behavior. This feature helps in understanding customer behavior, targeting marketing strategies, and identifying security threats in real time.', 'Splunk offers real-time alerts and notifications, enabling immediate detection of security threats and system performance issues, thus preventing potential losses. This feature helps in proactively identifying and addressing security threats and system performance issues to prevent potential losses.', 'Splunk allows historical data storage and analysis, compressing and storing logs for a set period, providing insights and advantages from stored data. 
This feature enables businesses to gain insights and advantages from stored historical data, aiding in decision-making and business optimization.']}], 'duration': 765.808, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s145652.jpg', 'highlights': ['Logs contain crucial details about server operations, including customer background, IP addresses, geographical locations, and security threats, providing essential insights for business operations.', 'Massive generation of logs, with thousands of lines of logs produced every single minute, underlines the challenge of extracting valuable insights from the sheer volume of data, emphasizing the need for efficient log exploration techniques.', 'Splunk enables real-time log forwarding, allowing collection and forwarding of logs from one server to a remote location, providing human-readable insights for end users.', 'Splunk provides real-time syslog analysis, allowing monitoring and analysis of logs generated in real time from any application or server, enabling monitoring of IP traffic and customer behavior.', 'Splunk offers real-time alerts and notifications, enabling immediate detection of security threats and system performance issues, thus preventing potential losses.', 'Splunk allows historical data storage and analysis, compressing and storing logs for a set period, providing insights and advantages from stored data.']}, {'end': 1349.756, 'segs': [{'end': 990.142, 'src': 'embed', 'start': 911.46, 'weight': 0, 'content': [{'end': 920.232, 'text': "but yeah, that's how Vodafone is using Splunk. And this New York Air Brake, this is one other.", 'start': 911.46, 'duration': 8.772}, {'end': 920.812, 'text': 'excuse me guys.', 'start': 920.232, 'duration': 0.58}, {'end': 931.477, 'text': "So New York Air Brake, it's another company which used Splunk and they saved around $1 billion in just one month by implementing Splunk.", 'start': 921.172, 'duration': 10.305}, 
{'end': 942.383, 'text': 'Well, what did they do? They used Splunk to monitor the logs of how the brakes, you know, the logs of brakes being applied in the trains.', 'start': 932.318, 'duration': 10.065}, {'end': 947.161, 'text': 'Correct?. So we have the New York Railroad right?', 'start': 943.744, 'duration': 3.417}, {'end': 954.505, 'text': 'They are part of that organization and whatever trains are and whenever brakes are applied, those logs were recorded.', 'start': 947.922, 'duration': 6.583}, {'end': 963.673, 'text': 'and, basically, based on whatever data they collected,', 'start': 960.672, 'duration': 3.001}, {'end': 973.796, 'text': 'they analyzed that even braking for a few seconds during the journey of a train resulted in consuming a lot of fuel later,', 'start': 963.673, 'duration': 10.123}, {'end': 976.617, 'text': 'when the brakes were released and acceleration was applied.', 'start': 973.796, 'duration': 2.821}, {'end': 986.341, 'text': 'So by just applying brakes and releasing them and increasing the speed by hitting on the gas pedal, they realized that a lot of fuel was consumed,', 'start': 978.418, 'duration': 7.923}, {'end': 990.142, 'text': 'and this they got it through Splunk by analyzing the brake logs.', 'start': 986.341, 'duration': 3.801}], 'summary': 'New york air brake saved $1 billion in a month using splunk to analyze train brake logs.', 'duration': 78.682, 'max_score': 911.46, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s911460.jpg'}, {'end': 1036.035, 'src': 'embed', 'start': 1012.11, 'weight': 2, 'content': [{'end': 1019.314, 'text': "So if you want to read more about Domino's, then you can definitely go to one of my blogs on Edureka's page and you can view it there.", 'start': 1012.11, 'duration': 7.204}, {'end': 1019.975, 'text': 'Okay, but.', 'start': 1019.614, 'duration': 0.361}, {'end': 1026.829, 'text': "what Domino's did was they understood the behavioral patterns of 
their consumers, whoever was buying their products.", 'start': 1021.007, 'duration': 5.822}, {'end': 1034.374, 'text': 'They understood at what time, on what dates, their customers prefer to buy edible items.', 'start': 1027.671, 'duration': 6.703}, {'end': 1036.035, 'text': 'And based on those details,', 'start': 1034.434, 'duration': 1.601}], 'summary': "Domino's analyzed consumer behavior to optimize product sales.", 'duration': 23.925, 'max_score': 1012.11, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1012110.jpg'}, {'end': 1083.124, 'src': 'embed', 'start': 1056.432, 'weight': 3, 'content': [{'end': 1059.753, 'text': "a lot of banks have implemented Splunk, but they don't really reveal it.", 'start': 1056.432, 'duration': 3.321}, {'end': 1063.154, 'text': 'And ING Bank have been pretty brave in revealing that.', 'start': 1060.253, 'duration': 2.901}, {'end': 1069.477, 'text': "And they've said that they use Splunk for faster troubleshooting of key application and get insight into customer behavior.", 'start': 1063.675, 'duration': 5.802}, {'end': 1077.861, 'text': "Well, it's not just these things, not just troubleshooting certain applications or getting inside into certain customer behavior.", 'start': 1070.216, 'duration': 7.645}, {'end': 1080.983, 'text': "It's also about monitoring the bank accounts.", 'start': 1078.201, 'duration': 2.782}, {'end': 1083.124, 'text': 'and are there any fraudulent transactions happening?', 'start': 1080.983, 'duration': 2.141}], 'summary': 'Ing bank openly uses splunk for faster troubleshooting and monitoring bank accounts for fraudulent transactions.', 'duration': 26.692, 'max_score': 1056.432, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1056432.jpg'}, {'end': 1267.731, 'src': 'heatmap', 'start': 1117.051, 'weight': 5, 'content': [{'end': 1124.993, 'text': 'And similarly, there are a lot of other domains 
where Splunk is being used, and Splunk is one such tool which is used almost in every single business.', 'start': 1117.051, 'duration': 7.942}, {'end': 1129.295, 'text': 'Because almost every business today is on the internet.', 'start': 1125.934, 'duration': 3.361}, {'end': 1133.336, 'text': "They have an internet presence, right? So that's where Splunk really benefits.", 'start': 1129.355, 'duration': 3.981}, {'end': 1137.517, 'text': "And that's where even companies using Splunk, they also get a real big benefit out of it.", 'start': 1133.636, 'duration': 3.881}, {'end': 1141.915, 'text': 'So that was about their customers and use cases.', 'start': 1138.772, 'duration': 3.143}, {'end': 1146.68, 'text': 'Let me go forward and actually talk about how Splunk works.', 'start': 1142.776, 'duration': 3.904}, {'end': 1148.982, 'text': "Let's talk about a few of the Splunk components now.", 'start': 1146.92, 'duration': 2.062}, {'end': 1159.931, 'text': 'Now, the most important components, or the primary components in Splunk, are those of forwarders, indexers, and search heads.', 'start': 1152.406, 'duration': 7.525}, {'end': 1164.695, 'text': 'okay, in a single instance deployment you have these three components which you will be dealing with,', 'start': 1159.931, 'duration': 4.764}, {'end': 1172.762, 'text': 'but when you go to a more distributed deployment of Splunk, or use of Splunk in a very vast manner,', 'start': 1164.695, 'duration': 8.067}, {'end': 1175.044, 'text': 'at that time you will have multiple forwarders in place.', 'start': 1172.762, 'duration': 2.282}, {'end': 1177.486, 'text': 'you have multiple indexers again,', 'start': 1175.044, 'duration': 2.442}, {'end': 1183.912, 'text': 'multiple search heads and along with this you will have a few more components which would manage these basic components,', 'start': 1177.486, 'duration': 6.426}, {'end': 1188.94, 'text': 'and they are nothing but deployer, cluster master and deployment server.', 'start': 1183.912, 
'duration': 5.028}, {'end': 1192.923, 'text': 'okay, let me explain each of these components one after the other.', 'start': 1188.94, 'duration': 3.983}, {'end': 1194.504, 'text': 'what do forwarders do?', 'start': 1192.923, 'duration': 1.581}, {'end': 1202.751, 'text': 'forwarders are basically responsible for collecting your data and forwarding it to another splunk instance, and in this case it would be the indexer.', 'start': 1194.504, 'duration': 8.247}, {'end': 1204.693, 'text': 'the indexer is where the data is being stored.', 'start': 1202.751, 'duration': 1.942}, {'end': 1210.162, 'text': 'whatever logs come in in real time or maybe not real time that will all be stored in the indexer.', 'start': 1204.693, 'duration': 5.469}, {'end': 1214.068, 'text': "But when it's inside the indexer, you can't just straight away look at this data.", 'start': 1210.463, 'duration': 3.605}, {'end': 1219.575, 'text': 'We got to use the search head, which will in turn access the data that is present in the indexer.', 'start': 1214.308, 'duration': 5.267}, {'end': 1226.772, 'text': 'And it will let you do the analysis or visualization and reporting, give you alerts, notification, all those things.', 'start': 1219.775, 'duration': 6.997}, {'end': 1228.273, 'text': 'So that happens at the search head.', 'start': 1226.812, 'duration': 1.461}, {'end': 1234.695, 'text': 'So we interact with the search head, the search head accesses the data from the indexer and the indexer gets the data from the forwarders.', 'start': 1228.293, 'duration': 6.402}, {'end': 1241.697, 'text': 'Okay, If these three, if these three components do this much, then what about the other three components that you see on the screen?', 'start': 1235.195, 'duration': 6.502}, {'end': 1242.337, 'text': 'What did they do??', 'start': 1241.857, 'duration': 0.48}, {'end': 1243.537, 'text': 'You can have that question.', 'start': 1242.737, 'duration': 0.8}, {'end': 1246.958, 'text': "Now that's where the concept of a 
distributed deployment comes.", 'start': 1244.077, 'duration': 2.881}, {'end': 1253.52, 'text': 'When you have a massive infrastructure, when you have a lot of data coming in, then one indexer, one folder and one search head is not enough.', 'start': 1247.238, 'duration': 6.282}, {'end': 1260.087, 'text': 'because your data needs to be safe, correct? You cannot have data loss or you cannot have downtime.', 'start': 1254.403, 'duration': 5.684}, {'end': 1266.13, 'text': 'So for that purpose, you have other components like deployer cluster master and deployment server in place.', 'start': 1260.107, 'duration': 6.023}, {'end': 1267.731, 'text': 'What does a deployer do?', 'start': 1266.731, 'duration': 1}], 'summary': 'Splunk is widely used in businesses for data analysis and visualization, with components like forwarders, indexers, and searches.', 'duration': 150.68, 'max_score': 1117.051, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1117051.jpg'}, {'end': 1188.94, 'src': 'embed', 'start': 1164.695, 'weight': 7, 'content': [{'end': 1172.762, 'text': 'but when you go to a more distributed deployment of Splunk or or use of Splunk in a very vast manner,', 'start': 1164.695, 'duration': 8.067}, {'end': 1175.044, 'text': 'at that time you will have multiple forwarders in place.', 'start': 1172.762, 'duration': 2.282}, {'end': 1177.486, 'text': 'you have multiple indexers again,', 'start': 1175.044, 'duration': 2.442}, {'end': 1183.912, 'text': 'multiple search heads and along with this you will have a few more components which would manage these basic components,', 'start': 1177.486, 'duration': 6.426}, {'end': 1188.94, 'text': 'and they are nothing but deployer, cluster master and deployment server.', 'start': 1183.912, 'duration': 5.028}], 'summary': 'Distributed splunk deployment involves multiple forwarders, indexers, search heads, and additional management components like deployer, cluster master, and 
deployment server.', 'duration': 24.245, 'max_score': 1164.695, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1164695.jpg'}], 'start': 911.46, 'title': 'Using splunk for significant cost savings', 'summary': "Showcases how new york air brake saved $1 billion in a month by using splunk to analyze brake logs, while other businesses like us's railroad organization, domino's pizza, and ing bank also benefited from using splunk, resulting in cost savings and valuable customer insights.", 'chapters': [{'end': 990.142, 'start': 911.46, 'title': 'Splunk saves new york air brake $1 billion', 'summary': 'Highlights how new york air brake saved $1 billion in just one month by using splunk to monitor brake logs, which revealed that even braking for a few seconds during the journey of a train resulted in consuming a lot of fuel later, and this insight was obtained through splunk analysis.', 'duration': 78.682, 'highlights': ['New York air brake saved around $1 billion in just one month by implementing Splunk to monitor brake logs.', 'Analyzing the brake logs revealed that even braking for a few seconds during the journey of a train resulted in consuming a lot of fuel later.', 'The insight about fuel consumption was obtained through Splunk analysis.']}, {'end': 1349.756, 'start': 991.519, 'title': 'Splunk components and use cases', 'summary': "Highlights how companies like us's railroad organization, domino's pizza, and ing bank saved significant amounts of money and accessed vital customer behavior insights through the use of splunk, a tool that is widely utilized in various businesses and internet presence.", 'duration': 358.237, 'highlights': ["US's railroad organization saved $1 billion in just one month through the use of Splunk. The organization saved $1 billion in just one month with the help of Splunk.", "Domino's Pizza utilized customer behavior insights to offer targeted marketing and close deals faster. 
Domino's Pizza used customer behavior insights to enhance targeted marketing and accelerate deal closures.", 'ING Bank revealed its use of Splunk for faster troubleshooting and insight into customer behavior, particularly in monitoring bank accounts for fraudulent transactions. ING Bank showcased its use of Splunk for rapid troubleshooting, customer behavior insight, and monitoring bank accounts for fraudulent activities.', 'Splunk is widely utilized in various businesses due to the internet presence of almost every business, providing significant benefits. Splunk is extensively used in numerous businesses due to their online presence, resulting in substantial advantages.', "The primary components of Splunk include forwarders, indexers, and search heads in a single instance deployment, while a distributed deployment involves multiple instances of these components along with deployer, cluster master, and deployment server. Splunk's primary components include forwarders, indexers, and search heads in a single instance deployment, and in a distributed deployment, there are multiple instances of these components along with deployer, cluster master, and deployment server."]}], 'duration': 438.296, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s911460.jpg', 'highlights': ['New York Air Brake saved around $1 billion in just one month by implementing Splunk to monitor brake logs.', "US's railroad organization saved $1 billion in just one month through the use of Splunk.", "Domino's Pizza utilized customer behavior insights to offer targeted marketing and close deals faster.", 'ING Bank revealed its use of Splunk for faster troubleshooting and insight into customer behavior, particularly in monitoring bank accounts for fraudulent transactions.', 'Analyzing the brake logs revealed that even braking for a few seconds during the journey of a train resulted in consuming a lot of fuel later.', 'Splunk is widely utilized in 
various businesses due to the internet presence of almost every business, providing significant benefits.', 'The insight about fuel consumption was obtained through Splunk analysis.', 'The primary components of Splunk include forwarders, indexers, and searches in a single instance deployment, while a distributed deployment involves multiple instances of these components along with deployer, cluster master, and deployment server.']}, {'end': 1638.976, 'segs': [{'end': 1405.039, 'src': 'embed', 'start': 1349.756, 'weight': 0, 'content': [{'end': 1358.603, 'text': "and your cluster master ensures that all these nodes are up and running at all the time and there's replication happening and your spark instance is behaving as per your expectations.", 'start': 1349.756, 'duration': 8.847}, {'end': 1366.088, 'text': 'And, besides that, it also manages your different search heads and it tells your search heads where to look for data.', 'start': 1359.403, 'duration': 6.685}, {'end': 1372.732, 'text': 'Supposing a search head wants to search for data in the indexer which is again stored in a remote location.', 'start': 1366.448, 'duration': 6.284}, {'end': 1375.875, 'text': 'which IP to go to, which indexer to access and get data?', 'start': 1372.732, 'duration': 3.143}, {'end': 1382.639, 'text': 'So all these answers, the cluster master would give to the captain over here in the search cluster.', 'start': 1375.995, 'duration': 6.644}, {'end': 1388.566, 'text': 'Okay And then you have the deployment server, which is similar to deployer.', 'start': 1383.2, 'duration': 5.366}, {'end': 1397.233, 'text': 'The deployer would issue configuration updates to the cluster members, right? 
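The deployer/deployment-server distinction described here can be made concrete with a configuration sketch. A deployment client (typically a forwarder) is pointed at the deployment server in deploymentclient.conf; the address below is a hypothetical placeholder, and 8089 is Splunk's default management port:

```ini
# deploymentclient.conf on a deployment client (e.g. a universal forwarder).
# targetUri is a hypothetical address; 8089 is the default management port.
[deployment-client]

[target-broker:deploymentServer]
targetUri = 10.0.0.5:8089
```

With this in place, apps and configuration that the deployment server is set up to push (via its server classes) reach the client automatically.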
But the deployment server would do a similar job to the forwarders.', 'start': 1388.786, 'duration': 8.447}, {'end': 1405.039, 'text': "If there are, if there's any change in the kind of data you want to pull from the source and send to the destination, then those updates,", 'start': 1397.913, 'duration': 7.126}], 'summary': 'Cluster master manages nodes, replication, and the Splunk instance; deployment server issues configuration updates to cluster members and forwarders.', 'duration': 55.283, 'max_score': 1349.756, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1349756.jpg'}, {'end': 1518.131, 'src': 'embed', 'start': 1446.986, 'weight': 3, 'content': [{'end': 1450.088, 'text': 'So here I explained the forwarder, the indexer and the searcher.', 'start': 1446.986, 'duration': 3.102}, {'end': 1457.831, 'text': 'So what I will do in my demonstration is that I will collect data from my forwarders and send it to another instance which is nothing but my indexer.', 'start': 1450.148, 'duration': 7.683}, {'end': 1463.517, 'text': "Now, since I will be using a single instance deployment, then it's just going to be one indexer and one search head.", 'start': 1458.652, 'duration': 4.865}, {'end': 1467.841, 'text': 'So I can do both those things from the instance where the data is coming to.', 'start': 1463.557, 'duration': 4.284}, {'end': 1470.103, 'text': "Okay So that's what I am going to do.", 'start': 1468.301, 'duration': 1.802}, {'end': 1475.148, 'text': 'And this is called, of course, log collection with the help of a universal forwarder.', 'start': 1471.084, 'duration': 4.064}, {'end': 1482.028, 'text': 'So, let me get started with my demonstration.', 'start': 1478.351, 'duration': 3.677}, {'end': 1486.849, 'text': "And for my demonstration, I'm going to use AWS instances where I have installed Splunk.", 'start': 1482.308, 'duration': 4.541}, {'end': 1494.311, 'text': 'Okay Over there, I will configure the 
syslogs to go from one instance to another instance.', 'start': 1487.149, 'duration': 7.162}, {'end': 1497.792, 'text': "And I'll show you how Splunk plays a major role there.", 'start': 1494.331, 'duration': 3.461}, {'end': 1507.775, 'text': "Right? So I'm going to, first of all, open up the AWS instances and guys do give me a confirmation if you can see my screen here.", 'start': 1498.812, 'duration': 8.963}, {'end': 1515.57, 'text': 'okay, i suppose my screen is visible here.', 'start': 1511.488, 'duration': 4.082}, {'end': 1518.131, 'text': 'so this is my aws instance, where i have a couple of.', 'start': 1515.57, 'duration': 2.561}], 'summary': 'Demonstration of log collection with splunk using aws instances and configuring syslogs for data transfer to indexer and search head.', 'duration': 71.145, 'max_score': 1446.986, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1446986.jpg'}], 'start': 1349.756, 'title': 'managing search cluster components and splunk log collection demo', 'summary': 'covers managing cluster components, including cluster master and deployment server responsibilities, and demonstrates log collection using splunk with data forwarding from a universal forwarder to an indexer, forming a distributed deployment.', 'chapters': [{'end': 1422.377, 'start': 1349.756, 'title': 'managing search cluster components', 'summary': "explains the role of the cluster master in managing nodes, replication, and the splunk instance, along with the deployment server's responsibility of issuing configuration and application updates to cluster members and forwarders.", 'duration': 72.621, 'highlights': ['The cluster master ensures all nodes are up and running and manages replication and Splunk instance behavior.', 'It directs search heads on where to look for data and provides necessary information such as IP and indexer access.', 'The deployment server issues configuration updates to cluster 
members and performs a similar role for the forwarders for data and application updates.']}, {'end': 1638.976, 'start': 1422.477, 'title': 'splunk log collection demo', 'summary': 'introduces the concept of log collection using splunk, demonstrated through the setup of data forwarding from a universal forwarder to an indexer on aws instances, forming a part of a distributed deployment with multiple components.', 'duration': 216.499, 'highlights': ['The demonstration showcases the setup of log collection using a universal forwarder to forward data to an indexer on AWS instances, forming a part of a distributed deployment with multiple components. The setup involves forwarding data from a universal forwarder to an indexer on AWS instances as part of a distributed deployment.', 'The speaker plans to demonstrate the use of a single instance deployment with one indexer and one search head for collecting and analyzing system logs from the universal forwarder. The demonstration will involve the use of a single instance deployment with one indexer and one search head for collecting and analyzing system logs from the universal forwarder.', 'The speaker will utilize AWS instances to configure the forwarding of syslogs from one instance to another and illustrate the role of Splunk in the process. 
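The forwarder-to-indexer pipeline this demo builds through the CLI can be sketched as two configuration fragments, shown together here for brevity although they live in separate files on separate machines. The indexer address and the output group name below are placeholders for illustration, not values from the video:

```ini
# outputs.conf on the universal forwarder: where to send collected events.
# 9997 is the conventional Splunk receiving port used in the demo;
# the indexer IP and group name "demo_indexers" are hypothetical.
[tcpout]
defaultGroup = demo_indexers

[tcpout:demo_indexers]
server = 172.30.0.45:9997

# inputs.conf on the indexer: accept forwarded data on port 9997,
# the config-file equivalent of `splunk enable listen 9997`.
[splunktcp://9997]
disabled = 0
```

Keeping the receiving stanza on the indexer and the tcpout stanza on the forwarder mirrors the two `splunk` CLI commands run later in the demo.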
AWS instances will be used to configure the forwarding of syslogs from one instance to another, demonstrating the role of Splunk in the process.']}], 'duration': 289.22, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1349756.jpg', 'highlights': ['The cluster master ensures all nodes are up and running and manages replication and Splunk instance behavior.', 'The deployment server issues configuration updates to cluster members and performs a similar role for the forwarders for data and application updates.', 'It directs search heads on where to look for data and provides necessary information such as IP and indexer access.', 'The demonstration showcases the setup of log collection using a universal forwarder to forward data to an indexer on AWS instances, forming a part of a distributed deployment with multiple components.', 'The speaker plans to demonstrate the use of a single instance deployment with one indexer and one search head for collecting and analyzing system logs from the universal forwarder.', 'The speaker will utilize AWS instances to configure the forwarding of syslogs from one instance to another and illustrate the role of Splunk in the process.']}, {'end': 2266.331, 'segs': [{'end': 1699.524, 'src': 'embed', 'start': 1665.351, 'weight': 5, 'content': [{'end': 1667.052, 'text': "I'll just show it to you end to end from the beginning.", 'start': 1665.351, 'duration': 1.701}, {'end': 1674.776, 'text': 'But remember that whenever you have to run Splunk through the CLI, then you have to always be logged in through a root user.', 'start': 1668.132, 'duration': 6.644}, {'end': 1683.342, 'text': 'So always give this command sudo su and then start performing your operations or commands, right? 
You need root privilege for that.', 'start': 1674.856, 'duration': 8.486}, {'end': 1690.341, 'text': "Okay, So anyways, moving forward, let's go to the opt directory,", 'start': 1684.242, 'duration': 6.099}, {'end': 1699.524, 'text': 'where we need to install any add-on applications to your server or to your system, right on linux.', 'start': 1690.341, 'duration': 9.183}], 'summary': "To run splunk through the cli, use 'sudo su' to log in as root user and perform operations; install add-on applications in the opt directory on linux.", 'duration': 34.173, 'max_score': 1665.351, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1665351.jpg'}, {'end': 1753.107, 'src': 'embed', 'start': 1715.671, 'weight': 0, 'content': [{'end': 1725.195, 'text': "Okay So I've removed the Splunk and this is just the tar file, right? So similarly, let me go to my universal forwarder.", 'start': 1715.671, 'duration': 9.524}, {'end': 1732.918, 'text': "Okay I'm going to do a CD and move to the opt directory.", 'start': 1730.197, 'duration': 2.721}, {'end': 1737.76, 'text': 'And over here, there is the Splunk forwarder already installed.', 'start': 1733.978, 'duration': 3.782}, {'end': 1738.88, 'text': "So I'm going to remove this.", 'start': 1737.88, 'duration': 1}, {'end': 1743.882, 'text': 'Okay Great.', 'start': 1739.42, 'duration': 4.462}, {'end': 1753.107, 'text': "So now I'm going to install both these, the universal forwarder and my Splunk enterprise instance from scratch.", 'start': 1745.563, 'duration': 7.544}], 'summary': 'Removing splunk and installing universal forwarder and splunk enterprise instance from scratch.', 'duration': 37.436, 'max_score': 1715.671, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1715671.jpg'}, {'end': 1948.795, 'src': 'embed', 'start': 1887.23, 'weight': 1, 'content': [{'end': 1893.153, 'text': "But since I'm starting Splunk for the 
first time after downloading and installing it,", 'start': 1887.23, 'duration': 5.923}, {'end': 1902.205, 'text': "you have to remember to give this flag of accept license okay, well, even if you don't give it, it's fine.", 'start': 1893.153, 'duration': 9.052}, {'end': 1908.868, 'text': "but it's just that when you don't give accept license, you'll be prompted to", 'start': 1902.205, 'duration': 6.663}, {'end': 1914.491, 'text': 'keep scrolling down until you agree to the complete document.', 'start': 1908.868, 'duration': 5.623}, {'end': 1918.313, 'text': "so it's just a shorter way to get your splunk instance up and running.", 'start': 1914.491, 'duration': 3.822}, {'end': 1923.242, 'text': 'So it says my web server is up at this IP address.', 'start': 1920.14, 'duration': 3.102}, {'end': 1925.803, 'text': 'So let me go to my Splunk enterprise instance.', 'start': 1923.342, 'duration': 2.461}, {'end': 1930.986, 'text': 'Okay So that was my cluster master, right? 
So this is the public IP.', 'start': 1927.184, 'duration': 3.802}, {'end': 1933.107, 'text': 'I can access it through my public IP.', 'start': 1931.226, 'duration': 1.881}, {'end': 1937.449, 'text': "Okay I'm going to just paste the public IP here and port number 8000.", 'start': 1933.407, 'duration': 4.042}, {'end': 1948.795, 'text': 'Okay So there we go.', 'start': 1937.449, 'duration': 11.346}], 'summary': 'Splunk setup completed with acceptance of license flag, accessing web server at specified ip.', 'duration': 61.565, 'max_score': 1887.23, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1887230.jpg'}, {'end': 2216.367, 'src': 'embed', 'start': 2191.312, 'weight': 3, 'content': [{'end': 2196.996, 'text': "whatever logs that I need to send, I can send by just running a command through my CLI, and that's what I'm going to do.", 'start': 2191.312, 'duration': 5.684}, {'end': 2203.341, 'text': 'but which data am I going to send to my splunk indexer? let me first figure that out.', 'start': 2196.996, 'duration': 6.345}, {'end': 2205.563, 'text': "let's see where my system logs are stored.", 'start': 2203.341, 'duration': 2.222}, {'end': 2208.685, 'text': 'so this particular server system logs.', 'start': 2205.563, 'duration': 3.122}, {'end': 2212.988, 'text': "since it's a Linux machine, it would always be stored at this particular folder.", 'start': 2208.685, 'duration': 4.303}, {'end': 2216.367, 'text': 'right inside my var folder there is log.', 'start': 2212.988, 'duration': 3.379}], 'summary': 'Data sent to splunk indexer by running command through cli, accessing system logs stored in var/log folder.', 'duration': 25.055, 'max_score': 2191.312, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2191312.jpg'}, {'end': 2275.357, 'src': 'embed', 'start': 2245.336, 'weight': 4, 'content': [{'end': 2252.403, 'text': 'This data is unstructured and 
this is definitely not in readable format, and you have this data that will keep getting generated every few seconds.', 'start': 2245.336, 'duration': 7.067}, {'end': 2255.206, 'text': "And there's simply a lot of data.", 'start': 2253.344, 'duration': 1.862}, {'end': 2258.249, 'text': 'Now, is it readable? Definitely not.', 'start': 2255.566, 'duration': 2.683}, {'end': 2260.907, 'text': 'correct. so what is the solution?', 'start': 2259.226, 'duration': 1.681}, {'end': 2266.331, 'text': 'if i want to read this and get insights out of this log data, send it to the.', 'start': 2260.907, 'duration': 5.424}, {'end': 2275.357, 'text': 'send it to my splunk instance, to my enterprise instance, which is going to act as my indexer, and search it and then perform analysis over there,', 'start': 2266.331, 'duration': 9.026}], 'summary': 'Unstructured data is continuously generated, and the solution is to send it to splunk for indexing and analysis.', 'duration': 30.021, 'max_score': 2245.336, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2245336.jpg'}], 'start': 1638.976, 'title': 'Installing and searching in splunk', 'summary': 'Provides a comprehensive guide on installing splunk enterprise and universal forwarder, emphasizing permissions. 
it also details the process of searching and forwarding data, including setting up the splunk instance and accessing system logs.', 'chapters': [{'end': 1996.355, 'start': 1638.976, 'title': 'installing splunk enterprise', 'summary': 'covers the process of installing splunk enterprise and splunk universal forwarder, including setting permissions and starting the instances, with emphasis on using root user for cli access and granting read, write, execute access to the folders.', 'duration': 357.379, 'highlights': ["The importance of using root user for CLI access and granting read, write, execute access to the folders It is emphasized that whenever running Splunk through the CLI, it's necessary to be logged in through a root user. Additionally, it's highlighted that setting the right permissions for the folders is crucial, requiring read, write, and execute access to the folders using the command 'chmod 777' for the Splunk forwarder and 'chmod -R 777' for the Splunk enterprise instance.", 'Process of installing Splunk Enterprise and universal forwarder from scratch The process of removing the existing Splunk instances, extracting the tar files, and setting up the Splunk enterprise and universal forwarder instances from scratch is described, demonstrating the steps involved in installing the software.', "Starting the Splunk instances and accessing the Splunk enterprise instance through a web server The steps to start the Splunk instances, including using the command '. 
/Splunk start' and accepting the license, as well as accessing the Splunk enterprise instance through a web server using the public IP and port number 8000, are outlined."]}, {'end': 2266.331, 'start': 1996.375, 'title': 'Splunk data search and forwarding', 'summary': 'Details the process of searching and forwarding data in splunk, including setting up the splunk instance, accepting the license, and accessing and sending system logs to the splunk indexer.', 'duration': 269.956, 'highlights': ['Setting up the Splunk instance involves starting the Splunk server daemon and accepting the license, which allows immediate startup of the Splunk instance. Starting Splunk server daemon, accepting the license, immediate startup of Splunk instance', 'Accessing and sending system logs to the Splunk indexer involves locating the system logs, which are typically stored in the var/log folder in a Linux machine, and then using the CLI to send the logs to the Splunk indexer for analysis. Locating system logs, using CLI to send logs to Splunk indexer', 'The unstructured and unreadable format of the system logs necessitates the need for a solution to make the data readable and gain insights from it for analysis. 
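The monitor step described here (watching /var/log/syslog into index main) has a configuration-file equivalent on the forwarder; a minimal sketch, where the sourcetype name is an assumption based on the narration:

```ini
# inputs.conf on the universal forwarder: tail the syslog file and
# route events to index "main"; the sourcetype name is assumed.
[monitor:///var/log/syslog]
index = main
sourcetype = universal_forwarder_server_logs
disabled = 0
```

Whether declared here or via `splunk add monitor` on the CLI, the forwarder tails the file continuously, which is how the freshly generated events keep appearing on the indexer during the demo.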
Unstructured and unreadable system logs, need for a solution to make data readable and gain insights']}], 'duration': 627.355, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s1638976.jpg', 'highlights': ['The process of installing Splunk Enterprise and universal forwarder from scratch', 'Starting the Splunk instances and accessing the Splunk enterprise instance through a web server', 'Setting up the Splunk instance involves starting the Splunk server daemon and accepting the license, which allows immediate startup of the Splunk instance', 'Accessing and sending system logs to the Splunk indexer involves locating the system logs, which are typically stored in the var/log folder in a Linux machine, and then using the CLI to send the logs to the Splunk indexer for analysis', 'The unstructured and unreadable format of the system logs necessitates the need for a solution to make the data readable and gain insights from it for analysis', "The importance of using root user for CLI access and granting read, write, execute access to the folders It is emphasized that whenever running Splunk through the CLI, it's necessary to be logged in through a root user"]}, {'end': 2719.924, 'segs': [{'end': 2380.398, 'src': 'embed', 'start': 2295.427, 'weight': 0, 'content': [{'end': 2310.409, 'text': "Okay, here I'm going to run the command dot slash Splunk, add forward server, forward hyphen server,", 'start': 2295.427, 'duration': 14.982}, {'end': 2315.992, 'text': 'and then i have to give the ip address of the instance to which i want to send this data right.', 'start': 2310.409, 'duration': 5.583}, {'end': 2320.974, 'text': 'so the ip address of this particular instance i can find it over here.', 'start': 2315.992, 'duration': 4.982}, {'end': 2322.375, 'text': 'so this is the cluster master correct.', 'start': 2320.974, 'duration': 1.401}, {'end': 2330.898, 'text': "so this is the ip address i'm going to copy this, go back to 
my instance and when i paste it here, along with the ip address,", 'start': 2322.375, 'duration': 8.523}, {'end': 2335.805, 'text': 'i have to specify the port number where that Splunk instance would listen to data.', 'start': 2330.898, 'duration': 4.907}, {'end': 2339.646, 'text': 'There are a number of ports which will be open in Splunk.', 'start': 2336.426, 'duration': 3.22}, {'end': 2346.707, 'text': 'And for receiving data, the default port number is 9997.', 'start': 2340.966, 'duration': 5.741}, {'end': 2348.228, 'text': 'So I need to specify that over here.', 'start': 2346.707, 'duration': 1.521}, {'end': 2359.229, 'text': "So by specifying this, I'm basically telling this particular forwarder of mine to add a particular server to where I want to forward data.", 'start': 2350.468, 'duration': 8.761}, {'end': 2369.332, 'text': 'And the server address is this one and send to port number 9997 of this particular server address.', 'start': 2360.73, 'duration': 8.602}, {'end': 2373.454, 'text': 'So let me hit enter and it says your session is invalid.', 'start': 2369.892, 'duration': 3.562}, {'end': 2375.335, 'text': 'Please log in.', 'start': 2373.894, 'duration': 1.441}, {'end': 2380.398, 'text': 'The username is admin and the password would be change me.', 'start': 2375.355, 'duration': 5.043}], 'summary': 'Setting up data forwarding to server ip address and port 9997 in splunk.', 'duration': 84.971, 'max_score': 2295.427, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2295427.jpg'}, {'end': 2586.922, 'src': 'embed', 'start': 2560.697, 'weight': 4, 'content': [{'end': 2564.758, 'text': 'So if I want to accept data, then I have to open the port number triple nine seven.', 'start': 2560.697, 'duration': 4.061}, {'end': 2575.04, 'text': 'And to do that, the command is Splunk, enable, listen, and followed by the port number.', 'start': 2564.858, 'duration': 10.182}, {'end': 2576.92, 'text': 'That is nothing 
but triple nine seven.', 'start': 2575.18, 'duration': 1.74}, {'end': 2580.341, 'text': 'Okay I need to log in here also.', 'start': 2578.84, 'duration': 1.501}, {'end': 2581.621, 'text': 'Let me say admin.', 'start': 2580.821, 'duration': 0.8}, {'end': 2586.922, 'text': 'And this time it is a different password because I changed my password at first login.', 'start': 2582.401, 'duration': 4.521}], 'summary': "To accept data, open port 9997 using 'splunk enable listen' command.", 'duration': 26.225, 'max_score': 2560.697, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2560697.jpg'}, {'end': 2657.84, 'src': 'embed', 'start': 2634.402, 'weight': 3, 'content': [{'end': 2641.405, 'text': "so along with that, it's also saying that i have got four thousand and twenty events indexed.", 'start': 2634.402, 'duration': 7.003}, {'end': 2644.026, 'text': "okay, and less than a day ago that's because i sent it.", 'start': 2641.405, 'duration': 2.621}, {'end': 2652.056, 'text': 'uh, now, in real time, correct, the latest event that was generated, what was 17 minutes ago, correct?', 'start': 2644.026, 'duration': 8.03}, {'end': 2654.678, 'text': 'so let me start searching for data.', 'start': 2652.056, 'duration': 2.622}, {'end': 2657.84, 'text': 'if you remember my data, I was storing it in index main.', 'start': 2654.678, 'duration': 3.162}], 'summary': 'Over 4,020 events indexed, latest event generated 17 minutes ago.', 'duration': 23.438, 'max_score': 2634.402, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2634402.jpg'}], 'start': 2266.331, 'title': 'Setting up data forwarding and monitoring in splunk', 'summary': 'Covers setting up data forwarding, including sending data to the enterprise instance, specifying the file to forward, and setting up data monitoring with 4020 events indexed and the latest event generated 17 minutes ago.', 'chapters': [{'end': 
2531.612, 'start': 2266.331, 'title': 'Setting up data forwarding in splunk', 'summary': 'Explains the process of setting up data forwarding in splunk, starting with sending data to the enterprise instance acting as the indexer, setting up the forward server with the ip address and port number, and specifying the file to forward, including the index and source type.', 'duration': 265.281, 'highlights': ['The chapter outlines the process of setting up data forwarding in Splunk, including sending data to the enterprise instance acting as the indexer, setting up the forward server with the IP address and port number, and specifying the file to forward, including the index and source type.', 'The default port number for receiving data in Splunk is 9997, which needs to be specified when setting up data forwarding.', 'The process involves adding a forward server with the IP address and sending data to the specified port number, in this case, 9997, for the data forwarding pipeline to be ready.', "The chapter also emphasizes the importance of specifying the file to forward, along with the index and source type for the data, in this case, mentioning 'index main' and the source type as 'universal forwarder server logs.'", "Before forwarding the data, the user needs to log in with the default username 'admin' and password 'change me', which can be changed after the first login."]}, {'end': 2719.924, 'start': 2531.992, 'title': 'Setting up splunk data monitoring', 'summary': 'Outlines the process of setting up splunk data monitoring, including opening the port number 9997 to accept data, and provides details on how to search and view indexed data, with a total of 4020 events indexed and the latest event generated 17 minutes ago.', 'duration': 187.932, 'highlights': ["The command to open the port number 9997 for listening to data is 'Splunk enable listen 9997', enabling the acceptance of data.", 'The indexed data consists of 4020 events, with the latest event generated 17 
minutes ago, providing quantifiable data on the volume and recency of the indexed events.', "The process of searching for data in the 'main' index yields the logs forwarded from the forwarder instance, with the source type specified as universal forwarder server logs, and the corresponding private IP address confirmed as 172.30.0.123, demonstrating the successful indexing and forwarding of data."]}], 'duration': 453.593, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2266331.jpg', 'highlights': ['The process involves adding a forward server with the IP address and sending data to the specified port number, in this case, 9997, for the data forwarding pipeline to be ready.', 'The default port number for receiving data in Splunk is 9997, which needs to be specified when setting up data forwarding.', 'The chapter outlines the process of setting up data forwarding in Splunk, including sending data to the enterprise instance acting as the indexer, setting up the forward server with the IP address and port number, and specifying the file to forward, including the index and source type.', 'The indexed data consists of 4020 events, with the latest event generated 17 minutes ago, providing quantifiable data on the volume and recency of the indexed events.', "The command to open the port number 9997 for listening to data is 'Splunk enable listen 9997', enabling the acceptance of data."]}, {'end': 3239.943, 'segs': [{'end': 2773.385, 'src': 'embed', 'start': 2719.924, 'weight': 0, 'content': [{'end': 2722.726, 'text': 'oops, uh, sorry, i think i explained it wrong.', 'start': 2719.924, 'duration': 2.802}, {'end': 2724.267, 'text': 'what am i saying?', 'start': 2722.726, 'duration': 1.541}, {'end': 2726.729, 'text': 'the data is coming in from a universal forwarder, correct.', 'start': 2724.267, 'duration': 2.462}, {'end': 2731.583, 'text': 'so my universal forwarder ip address would be this 172.30..', 'start': 2726.729, 
'duration': 4.854}, {'end': 2738.98, 'text': '0.123, and that is the same that you see in my instance of indexer.', 'start': 2731.583, 'duration': 7.397}, {'end': 2741.762, 'text': 'ip address is the same where slash log.', 'start': 2738.98, 'duration': 2.782}, {'end': 2745.304, 'text': 'slash syslog is the source from where my data is coming in.', 'start': 2741.762, 'duration': 3.542}, {'end': 2747.405, 'text': 'this is the host and this is the source type.', 'start': 2745.304, 'duration': 2.101}, {'end': 2751.168, 'text': 'is my universal forwarder server logs?', 'start': 2747.405, 'duration': 3.763}, {'end': 2755.591, 'text': 'okay, perfect, now, whatever, uh, events are coming in.', 'start': 2751.168, 'duration': 4.423}, {'end': 2758.442, 'text': 'So in your logs you will have events right?', 'start': 2756.161, 'duration': 2.281}, {'end': 2763.303, 'text': "So you'll have multiple events coming in and those events would be different.", 'start': 2759.162, 'duration': 4.141}, {'end': 2766.583, 'text': 'They would have different values for different transactions.', 'start': 2764.123, 'duration': 2.46}, {'end': 2768.824, 'text': 'So the difference would be only that.', 'start': 2766.944, 'duration': 1.88}, {'end': 2773.385, 'text': 'And that is what is going to come in here as different events.', 'start': 2770.584, 'duration': 2.801}], 'summary': 'Data is received from a universal forwarder with ip address 172.30.0.123, and various events with different values are processed.', 'duration': 53.461, 'max_score': 2719.924, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2719924.jpg'}, {'end': 2948.908, 'src': 'embed', 'start': 2921.83, 'weight': 2, 'content': [{'end': 2926.651, 'text': 'but if you want to filter this data even more, then you can click on this date.', 'start': 2921.83, 'duration': 4.821}, {'end': 2936.553, 'text': "second okay, so here it shows that on the 24th second of any minute we've got these 
many events coming in, and on the 25th second and so on,", 'start': 2926.651, 'duration': 9.902}, {'end': 2940.922, 'text': 'these are the different seconds on which, uh, we have multiple events common.', 'start': 2936.553, 'duration': 4.369}, {'end': 2948.908, 'text': 'so on the 24th event, if I filter it for that time, then you can see that they all have a similar time frame.', 'start': 2940.922, 'duration': 7.986}], 'summary': 'Data can be filtered by time, showing events per second.', 'duration': 27.078, 'max_score': 2921.83, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2921830.jpg'}, {'end': 2985.446, 'src': 'embed', 'start': 2963.259, 'weight': 3, 'content': [{'end': 2971.5, 'text': 'okay, but however, In my next live session I will give you a more detailed explanation of the Splunk search and reporting commands.', 'start': 2963.259, 'duration': 8.241}, {'end': 2973.981, 'text': 'These are some of the search and reporting commands.', 'start': 2971.96, 'duration': 2.021}, {'end': 2976.442, 'text': 'These are just a few of them.', 'start': 2975.441, 'duration': 1.001}, {'end': 2983.245, 'text': "I will give you a detailed explanation of a number of other commands in my next live session and I'll teach you more.", 'start': 2976.642, 'duration': 6.603}, {'end': 2985.446, 'text': 'Hopefully, we can learn a lot from each other.', 'start': 2983.685, 'duration': 1.761}], 'summary': 'Next live session will cover detailed explanation of splunk search and reporting commands.', 'duration': 22.187, 'max_score': 2963.259, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2963259.jpg'}, {'end': 3188.389, 'src': 'embed', 'start': 3156.79, 'weight': 4, 'content': [{'end': 3162.066, 'text': 'you stand a chance to win to avail an exclusive discount Right.', 'start': 3156.79, 'duration': 5.276}, {'end': 3168.895, 'text': 'I would request you to fill in this form and 
someone from my team would be sending you the link to this on the chat box.', 'start': 3162.086, 'duration': 6.809}, {'end': 3171.538, 'text': 'So pick it up from there and fill this out.', 'start': 3168.935, 'duration': 2.603}, {'end': 3179.465, 'text': 'And you can also of course mention if you want to be notified of any other live sessions and on which topics you want to be notified of.', 'start': 3172.441, 'duration': 7.024}, {'end': 3180.905, 'text': "So you'll get those details.", 'start': 3179.645, 'duration': 1.26}, {'end': 3188.389, 'text': 'And for those of you who are serious about becoming a Splunk administrator and power user, then we have Edrica.', 'start': 3181.626, 'duration': 6.763}], 'summary': 'Fill out the form to avail exclusive discount and receive details about live sessions and splunk administrator training.', 'duration': 31.599, 'max_score': 3156.79, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s3156790.jpg'}], 'start': 2719.924, 'title': 'Universal forwarder data flow and splunk data filter and analysis', 'summary': 'Discusses the flow of data from a universal forwarder to an indexer, emphasizing the importance of ip addresses and the concept of events with varying values. 
It also covers the demonstration of filtering and analyzing data in Splunk, including insights on event frequency and time zone considerations, and upcoming sessions on Splunk search and reporting commands, knowledge objects, and distributed cluster setup.', 'chapters': [{'end': 2773.385, 'start': 2719.924, 'title': 'Universal forwarder data flow', 'summary': 'Discusses the flow of data from a universal forwarder to an indexer, emphasizing the importance of IP addresses and the concept of events with varying values.', 'duration': 53.461, 'highlights': ['The data is coming in from a universal forwarder with the IP address 172.30.0.123, the same IP address seen on the indexer instance.', 'The source of the data is /log/syslog, which serves as the host and the source type.', 'Events coming in will have different values for different transactions, representing different events.']}, {'end': 3239.943, 'start': 2773.505, 'title': 'Splunk data filtering and analysis', 'summary': 'Covers the demonstration of filtering and analyzing data in Splunk, including filtering by host, source, date, and minute, with insights on event frequency and time zone considerations, and concludes with a preview of upcoming sessions on Splunk search and reporting commands, knowledge objects, and distributed cluster setup.', 'duration': 466.438, 'highlights': ['The demonstration includes filtering data by host, source, date, and minute, providing insights on event frequency and time zone considerations. The speaker demonstrates filtering data by host, source, date, and minute, providing insights on event frequency and time zone considerations, such as different time zones due to server location.', 'Preview of upcoming sessions on Splunk search and reporting commands, knowledge objects, and distributed cluster setup.
The speaker previews upcoming sessions on Splunk search and reporting commands, knowledge objects, and distributed cluster setup, offering to teach more and interact with the audience.', "Encourages audience to fill a survey form for a chance to win an exclusive discount, and promotes Edureka's Splunk course and blog on a Splunk use case. The speaker encourages the audience to fill a survey form for a chance to win an exclusive discount, and promotes Edureka's Splunk course and a blog on a Splunk use case at Domino's."]}], 'duration': 520.019, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/6lX4DOd1T-s/pics/6lX4DOd1T-s2719924.jpg', 'highlights': ['The data is coming in from a universal forwarder with the IP address 172.30.0.123, the same IP address seen on the indexer instance.', 'Events coming in will have different values for different transactions, representing different events.', 'The demonstration includes filtering data by host, source, date, and minute, providing insights on event frequency and time zone considerations.', 'Preview of upcoming sessions on Splunk search and reporting commands, knowledge objects, and distributed cluster setup.', "Encourages audience to fill a survey form for a chance to win an exclusive discount, and promotes Edureka's Splunk course and blog on a Splunk use case."]}], 'highlights': ['New York Air Brake saved around $1 billion in just one month by implementing Splunk to monitor brake logs.', "The US railroad organization saved $1 billion in just one month through the use of Splunk.", "Domino's Pizza utilized customer behavior insights to offer targeted marketing and close deals faster.", 'ING Bank revealed its use of Splunk for faster troubleshooting and insight into customer behavior, particularly in monitoring bank accounts for fraudulent transactions.', 'Logs contain crucial details about server operations, including customer background, IP addresses, geographical locations, and
security threats, providing essential insights for business operations.', 'Logs are the go-to archives for gaining company-wide operational intelligence, essential for businesses operating online like Edureka.', 'Explanation of the problems with log data and the need for a tool like Splunk.', 'Demonstration of collecting logs from a remote instance, putting them onto the Splunk instance, and performing analysis and visualization on the data.', 'Massive generation of logs, with thousands of log lines produced every single minute, underlines the challenge of extracting valuable insights from the sheer volume of data, emphasizing the need for efficient log exploration techniques.', 'Splunk enables real-time log forwarding, allowing collection and forwarding of logs from one server to a remote location, providing human-readable insights for end users.', 'Splunk provides real-time syslog analysis, allowing monitoring and analysis of logs generated in real time from any application or server, enabling monitoring of IP traffic and customer behavior.', 'Splunk offers real-time alerts and notifications, enabling immediate detection of security threats and system performance issues, thus preventing potential losses.', 'Splunk allows historical data storage and analysis, compressing and storing logs for a set period, providing insights and advantages from stored data.', 'The process involves adding a forward server with the IP address and sending data to the specified port number, in this case 9997, for the data forwarding pipeline to be ready.', 'The default port number for receiving data in Splunk is 9997, which needs to be specified when setting up data forwarding.', 'The chapter outlines the process of setting up data forwarding in Splunk, including sending data to the enterprise instance acting as the indexer, setting up the forward server with the IP address and port number, and specifying the file to forward, including the index and source type.', 'The indexed
data consists of 4020 events, with the latest event generated 17 minutes ago, providing quantifiable data on the volume and recency of the indexed events.', "The command to open port 9997 for listening to incoming data is 'splunk enable listen 9997', enabling the indexer to accept forwarded data."]}
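The forwarding setup summarized in the highlights above (open port 9997 on the receiving indexer, register a forward-server on the universal forwarder, then pick a file to monitor with an index and source type) can be sketched with the Splunk CLI. This is a minimal sketch, not the exact commands from the session: the IP address 172.30.0.123 comes from the demo, while the $SPLUNK_HOME paths, the admin:changeme credentials, and the /var/log/syslog file are illustrative assumptions you would replace with your own values.

```shell
# --- On the Splunk Enterprise instance acting as the indexer ---
# Open the default receiving port 9997 (credentials are placeholders).
$SPLUNK_HOME/bin/splunk enable listen 9997 -auth admin:changeme

# --- On the machine generating the logs (universal forwarder) ---
# Point the forwarder at the indexer's IP address and receiving port.
$SPLUNK_HOME/bin/splunk add forward-server 172.30.0.123:9997

# Tell the forwarder which file to tail, and which index and
# source type the events should carry when they are indexed.
$SPLUNK_HOME/bin/splunk add monitor /var/log/syslog -index main -sourcetype syslog

# Restart the forwarder so the new inputs and outputs take effect.
$SPLUNK_HOME/bin/splunk restart
```

Once both sides are up, the forwarded events should be searchable on the indexer with a query such as `index=main sourcetype=syslog`, which is where the host, source, and time filtering shown in the demo takes place.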