In this post, I will describe the following topics for vRealize Log Insight Cloud.
What is a Log Source?
A log source is a component (hardware or software) that generates log data. It can be a physical device (server, storage, or network) in your private cloud, or any software or tool running in a private or public cloud.
The following table illustrates some common log sources.
Sr. No | Category | Log Source | Description |
1 | Native Public Cloud | AWS Service Logs | Using a Lambda function, logs can be ingested from 40+ AWS log sources directly into vRealize Log Insight Cloud. For example, CloudFront logs can be sent to CloudTrail by making configuration changes, and CloudTrail can then dump all its logs to an S3 bucket. A Lambda function triggered by that S3 bucket can forward the logs directly to vRealize Log Insight Cloud. |
2 | Native Public Cloud | Azure Service Logs | Using an Azure Function, logs can be ingested from all Azure services into vRealize Log Insight Cloud. For example, Azure Network Security Group (NSG) logs can be dumped to Blob Storage, and an Azure Function can be configured to read from Blob Storage and send them to vRealize Log Insight Cloud. |
3 | Native Public Cloud | Google Cloud Service Logs | Almost all GCP service logs can be routed to Google Pub/Sub, which can be configured to forward all logs directly to vRealize Log Insight Cloud. |
4 | DevOps Container Management | Cloud Native Application Logs in Kubernetes | Collect logs from Kubernetes Pods/containers and forward them to vRealize Log Insight Cloud. Open-source agents like Fluentd/Fluent Bit can be used for this; the agent needs to be deployed as a DaemonSet. |
5 | DevOps CI/CD Tool | Jenkins Application Logs | Collect Jenkins application logs and forward them to vRealize Log Insight Cloud. Open-source agents like Fluentd/Fluent Bit can be used for this. Sample path for Linux-based machines: /var/log/jenkins/jenkins.log |
6 | DevOps Version Control Tool | GitHub Events | GitHub events can be published to vRealize Log Insight Cloud by configuring a webhook at the Organization or Repository level. |
7 | Infrastructure Devices | Network Devices | Collect logs from network devices. Logs can be forwarded via the Cloud Proxy appliance, which has a log-forwarder agent listening on the syslog and CFAPI protocols. |
8 | VMware Cloud | VMware SDDC Logs | Collect logs from VMware SDDC components (vCenter, ESXi, NSX-T). Logs can be forwarded via the Cloud Proxy appliance, which has a log-forwarder agent listening on the syslog and CFAPI protocols. |
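To make the AWS pattern above concrete, here is a minimal sketch of a Lambda handler that reads a newly written S3 object and forwards its lines to the Log Insight Cloud ingestion API. The endpoint URL, the event shape (a JSON array of objects with a "text" field), and the environment-variable name for the API token are assumptions for illustration; verify them against your region and the actual scripts before relying on this.

```python
import gzip
import json
import os
import urllib.request

# Assumed ingestion endpoint; confirm the URL for your Log Insight Cloud region.
INGESTION_URL = "https://data.mgmt.cloud.vmware.com/le-mans/v1/streams/ingestion-pipeline-stream"


def build_payload(lines, log_type="aws"):
    """Wrap raw log lines in the JSON event shape assumed by the ingestion API."""
    return [{"text": line, "log_type": log_type} for line in lines if line.strip()]


def forward_events(events, api_token):
    """POST a batch of events with the API token as a Bearer header."""
    req = urllib.request.Request(
        INGESTION_URL,
        data=json.dumps(events).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


def lambda_handler(event, context):
    """S3-trigger entry point: fetch each new object and forward its lines."""
    import boto3  # available in the AWS Lambda Python runtime

    s3 = boto3.client("s3")
    token = os.environ.get("LOG_INSIGHT_API_TOKEN", "")  # hypothetical variable name
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        if key.endswith(".gz"):  # many AWS services deliver gzipped objects
            raw = gzip.decompress(raw)
        forward_events(build_payload(raw.decode("utf-8").splitlines()), token)
```

In practice you would also batch large objects and handle retries; this sketch only shows the trigger-to-ingestion flow.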
What are the supported Agents?
Logs can be ingested into vRealize Log Insight Cloud using any of the supported log agents.
Sr. No | Log Agent | Description |
1 | Cloud Proxy | Cloud Proxy is a gateway appliance between your private cloud and VMware Cloud Services. It is a virtual machine that can be configured to forward logs via syslog or CFAPI. |
2 | Fluentd | Fluentd is an open-source data collector for a unified logging layer. It collects logs from the configured sources, processes/transforms them, and then forwards them to vRealize Log Insight Cloud. |
3 | Fluent Bit | Fluent Bit is an open-source log processor and forwarder that lets you collect data such as metrics and logs from different sources, enrich it with filters, and send it to multiple destinations, including vRealize Log Insight Cloud. |
4 | Log Insight Agent | A vRealize Log Insight Agent collects events from log files and forwards them to vRealize Log Insight Cloud (via the Cloud Proxy) or to any third-party syslog destination. |
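As a sketch of how one of these agents is pointed at vRealize Log Insight Cloud, here is a minimal Fluent Bit configuration that tails a log file and ships it over HTTPS. The host, URI path, and API token placeholder are assumptions for illustration; the `tail` and `http` plugin names and their parameters are standard Fluent Bit, but verify the endpoint for your environment.

```
[INPUT]
    Name    tail
    Path    /var/log/myapp/*.log
    Tag     myapp

[OUTPUT]
    Name    http
    Match   myapp
    # Assumed ingestion host and path; confirm for your Log Insight Cloud region.
    Host    data.mgmt.cloud.vmware.com
    Port    443
    URI     /le-mans/v1/streams/ingestion-pipeline-stream
    Header  Authorization Bearer <API_TOKEN>
    Format  json
    tls     On
```

The `Match` value ties the output to the tagged input, so additional inputs can be routed to other destinations independently.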
How to configure various log sources to send logs to vRealize Log Insight Cloud
In the following table, I have shared example configurations for some log sources.
Sr. No | Category | Log Source | High Level Procedure |
1 | Native Public Cloud | AWS Service Logs | 1) Clone my GitHub repo; it includes Terraform scripts to deploy the Lambda with triggers for S3 bucket logs, S3 event logs, and CloudWatch logs. 2) Update terraform.tfvars for all three scripts. 3) Run the Terraform commands (init, plan, apply). 4) Verify logs with the filter “log_type starts with aws”. For a detailed walkthrough, you can refer to this blog. |
2 | Native Public Cloud | Azure Service Logs | Deploy an Azure Function to collect logs from Azure Blob Storage: 1) Generate a vRealize Log Insight Cloud API key. 2) Deploy the ARM template to create the Blob Storage by navigating to Log Sources –> Azure –> Blob Storage –> Create a Blob Storage and clicking Deploy to Azure (provide basic details such as Subscription, Resource Group, API_URL, and API_Token). 3) Navigate to the Diagnostics Settings section under the Monitoring tab of the vRealize Log Insight Cloud Function and configure diagnostic settings; the name should start with “VMwareLogsFunction”. 4) Navigate to the Functions section under the Functions tab of the vRealize Log Insight Cloud Function, click blobStorageFunction, and configure the Blob Storage trigger to fetch logs from Blob Storage. 5) Configure any Azure service logs (for example, NSG) to be saved in the Blob Storage created in step 2. 6) Verify logs with the filter “log_type contains azure_log”. For a detailed walkthrough, you can refer to this blog. |
3 | DevOps Container Management | Cloud Native Application Logs in Kubernetes | Deploy Fluentd as a DaemonSet using a Helm chart to collect logs from the Kubernetes cluster: 1) Add the chart repo: helm repo add loginsight-cloud https://munishpalmakhija.github.io/loginsight-cloud-helm/ 2) Get the values file into your working directory: helm show values loginsight-cloud/loginsight-cloud-helm > values.yaml 3) Update the values file with the API token and other relevant details. 4) Install the chart: helm install <nameofrelease> loginsight-cloud/loginsight-cloud-helm -f values.yaml 5) Verify the Helm release: helm list 6) Verify logs with the filter “log_type contains kubernetes”. For a detailed walkthrough, you can refer to this blog. |
4 | DevOps CI/CD Tool | Jenkins Application Logs | Deploy Fluentd to collect logs from a Jenkins application on Ubuntu: 1) Install Fluentd using the deb package (td-agent): curl -L https://toolbelt.treasuredata.com/sh/install-ubuntu-trusty-td-agent3.sh | sh 2) Install the HTTP output plugin: sudo /usr/sbin/td-agent-gem install fluent-plugin-out-http-ext 3) Check that td-agent is running: sudo /etc/init.d/td-agent status 4) Configure the td-agent.conf file located in the /etc/td-agent folder, updating the input, filter, and output configuration; you can refer to the Jenkins-specific details here. 5) Restart td-agent: sudo /etc/init.d/td-agent restart 6) Verify logs with the filter “log_type contains jenkins”. For a detailed walkthrough of a similar procedure for Apache, you can refer to this blog. |
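For the Jenkins row above, a td-agent.conf along the following lines covers the input and output sections the procedure mentions. This is a sketch, not the exact configuration from the linked walkthrough: the ingestion endpoint URL and token placeholder are assumptions, and the fluent-plugin-out-http-ext parameter names (endpoint_url, serializer, custom_headers) should be checked against the plugin's own documentation.

```
<source>
  @type tail
  path /var/log/jenkins/jenkins.log
  pos_file /var/log/td-agent/jenkins.log.pos
  tag jenkins
  <parse>
    @type none
  </parse>
</source>

<match jenkins>
  @type http_ext
  # Assumed ingestion endpoint; confirm for your Log Insight Cloud region.
  endpoint_url https://data.mgmt.cloud.vmware.com/le-mans/v1/streams/ingestion-pipeline-stream
  http_method post
  serializer json
  custom_headers {"Authorization":"Bearer <API_TOKEN>"}
</match>
```

A filter block between the source and match sections can enrich each event (for example, adding a log_type field) before it is forwarded.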