Configure log sources to forward logs to vRealize Log Insight Cloud

In this post I will cover the following for vRealize Log Insight Cloud: what a log source is, which log agents are supported, and how to configure various log sources to send their logs to vRealize Log Insight Cloud.

What is a Log Source?

A log source is a component (hardware or software) that generates log data. It can be a physical device (server, storage, or network) in your private cloud, or any software or tool running in a private or public cloud.

The following list illustrates some common log sources.

1. AWS Service Logs (Native Public Cloud): Using a Lambda function, logs can be ingested from 40+ AWS log sources directly into vRealize Log Insight Cloud. For example, CloudFront logs can be sent to CloudTrail with a configuration change, CloudTrail can dump its logs to an S3 bucket, and a Lambda function triggered by that bucket forwards the logs to vRealize Log Insight Cloud (the underlying ingestion call is sketched after this list).
2. Azure Service Logs (Native Public Cloud): Using an Azure Function, logs can be ingested from all Azure services into vRealize Log Insight Cloud. For example, Azure Network Security Group (NSG) logs can be dumped to Blob Storage, and an Azure Function can be configured to read from Blob Storage and send the logs to vRealize Log Insight Cloud.
3. Google Cloud Service Logs (Native Public Cloud): Almost all GCP service logs can be routed to Google Pub/Sub, and Pub/Sub can be configured to forward the logs directly to vRealize Log Insight Cloud.
4. Cloud Native Application Logs in Kubernetes (DevOps Container Management): Collect logs from Kubernetes pods/containers and forward them to vRealize Log Insight Cloud. Open-source agents such as Fluentd or Fluent Bit can be used, deployed as a DaemonSet.
5. Jenkins Application Logs (DevOps CI/CD Tool): Collect Jenkins application logs and forward them to vRealize Log Insight Cloud. Open-source agents such as Fluentd or Fluent Bit can be used. A sample path on Linux-based machines is /var/log/jenkins/jenkins.log.
6. GitHub Events (DevOps Version Control Tool): GitHub events can be published to vRealize Log Insight Cloud by configuring a webhook at the organization or repository level (a scripted example follows this list).
7. Network Devices (Infrastructure Devices): Collect logs from network devices. Logs can be forwarded via the Cloud Proxy appliance, which has a log-forwarder agent listening on the syslog and CFAPI protocols (an rsyslog forwarding example follows this list).
8. VMware SDDC Logs (VMware Cloud): Collect logs from VMware SDDC components (vCenter, ESXi, NSX-T). Logs can be forwarded via the Cloud Proxy appliance, which has a log-forwarder agent listening on the syslog and CFAPI protocols.
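All three cloud-native integrations above (Lambda, Azure Function, Pub/Sub) ultimately post JSON events to the vRealize Log Insight Cloud ingestion API using an API key. The sketch below shows that call with curl; the endpoint URL and payload shape reflect my understanding of the public ingestion API, and the API key is a placeholder you generate in vRealize Log Insight Cloud, so verify both before relying on them.

# VRLIC_API_KEY is a placeholder for an API key generated in vRealize Log Insight Cloud.
export VRLIC_API_KEY="<your-api-key>"
# The ingestion URL below is an assumption - confirm it in the vRealize Log Insight Cloud documentation.
curl -X POST "https://data.mgmt.cloud.vmware.com/le-mans/v1/streams/ingestion-pipeline-stream" \
  -H "Authorization: Bearer ${VRLIC_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '[{"text": "test event sent via the ingestion API"}]'
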
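For GitHub events, the webhook can be created in the GitHub UI, or scripted against the standard GitHub REST API as sketched below. OWNER, REPO, GH_TOKEN and WEBHOOK_URL are placeholders; WEBHOOK_URL is the URL that the vRealize Log Insight Cloud GitHub log source instructions give you.

# OWNER, REPO, GH_TOKEN and WEBHOOK_URL are hypothetical placeholders.
curl -X POST "https://api.github.com/repos/${OWNER}/${REPO}/hooks" \
  -H "Authorization: Bearer ${GH_TOKEN}" \
  -H "Accept: application/vnd.github+json" \
  -d "{\"name\":\"web\",\"active\":true,\"events\":[\"push\",\"pull_request\"],\"config\":{\"url\":\"${WEBHOOK_URL}\",\"content_type\":\"json\"}}"
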
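For network devices and SDDC components, the pattern is to point each device's syslog output at the Cloud Proxy. As one example, on a Linux host running rsyslog the forwarding rule could look like the sketch below; the Cloud Proxy FQDN and port 514 are assumptions, so use the address and port your Cloud Proxy actually listens on.

# cloud-proxy.example.com and port 514 are hypothetical placeholders.
cat <<'EOF' | sudo tee /etc/rsyslog.d/90-vrlic-forward.conf
# Forward all facilities and severities to the Cloud Proxy over TCP (@@ = TCP, @ = UDP)
*.* @@cloud-proxy.example.com:514
EOF
sudo systemctl restart rsyslog
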

What are the supported Agents?

Logs can be ingested into vRealize Log Insight Cloud using any of the supported log agents.

1. Cloud Proxy: A gateway appliance between the private cloud and VMware Cloud services. It is a virtual machine that can be configured to forward logs via syslog or CFAPI.
2. Fluentd: An open-source data collector for a unified logging layer. It collects logs from the configured sources, processes/transforms them, and then redirects them to vRealize Log Insight Cloud.
3. Fluent Bit: An open-source log processor and forwarder that collects data such as metrics and logs from different sources, enriches them with filters, and sends them to multiple destinations, including vRealize Log Insight Cloud (a minimal configuration sketch follows this list).
4. Log Insight Agent: A vRealize Log Insight agent collects events from log files and forwards them to vRealize Log Insight Cloud (via the Cloud Proxy) or to any third-party syslog destination.
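To make this concrete, here is a minimal Fluent Bit sketch that tails a log file and ships it through the ingestion API using the built-in http output. The ingestion host/URI, the field rename, and the API key are the same assumptions as in the curl example above; adjust paths and verify the expected payload format against the vRealize Log Insight Cloud documentation.

# Paths, the ingestion URI and the API key are assumptions/placeholders.
cat <<'EOF' | sudo tee /etc/fluent-bit/fluent-bit.conf
[INPUT]
    Name   tail
    Path   /var/log/syslog
    Tag    host.syslog

[FILTER]
    # Rename the tail plugin's "log" field to "text", which is what the ingestion API expects (assumption).
    Name    modify
    Match   *
    Rename  log text

[OUTPUT]
    Name    http
    Match   *
    Host    data.mgmt.cloud.vmware.com
    Port    443
    URI     /le-mans/v1/streams/ingestion-pipeline-stream
    Header  Authorization Bearer <your-api-key>
    Format  json
    tls     On
EOF
sudo systemctl restart fluent-bit
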

 

How to configure various log sources to send logs to vRealize Log Insight Cloud

Below I have shared example configurations for some of these log sources.

AWS Service Logs (Native Public Cloud)
1. Clone my GitHub repo. It includes Terraform scripts to deploy the Lambda function with triggers for S3 bucket logs, S3 event logs, and CloudWatch logs.
2. Update terraform.tfvars for all three scripts.
3. Run the Terraform commands (init, plan, apply); a command sketch follows below.
4. Verify the logs with the filter "log_type starts with aws".
5. For a detailed walkthrough, you can refer to this blog.
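A minimal command sketch of the flow above; the repository URL and directory names are placeholders for the repo linked in the post, and the values you set in terraform.tfvars depend on that repo's variable definitions.

# <my-github-repo-url> and the directory names are hypothetical placeholders.
git clone <my-github-repo-url>
cd <repo-directory>/<one-of-the-three-scripts>

# Set the vRealize Log Insight Cloud API key, AWS region, bucket/log-group names, etc.
vi terraform.tfvars

# Deploy the Lambda function and its trigger; repeat for the other two scripts.
terraform init
terraform plan
terraform apply
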
Azure Service Logs (Native Public Cloud): deploy an Azure Function to collect logs from Azure Blob Storage.
1. Generate a vRealize Log Insight Cloud API key.
2. Deploy the ARM template that creates the Blob Storage by navigating to Log Sources –> Azure –> Blob Storage –> Create a Blob Storage and clicking Deploy to Azure (provide basic details such as Subscription, Resource Group, API_URL, and API_Token). A CLI equivalent is sketched below.
3. Navigate to the Diagnostic Settings section under the Monitoring tab of the vRealize Log Insight Cloud function and configure Diagnostic Settings. The name should start with "VMwareLogsFunction".
4. Navigate to the Functions section under the Functions tab of the vRealize Log Insight Cloud function, click blobStorageFunction, and configure the Blob Storage trigger to fetch logs from Blob Storage.
5. Configure any Azure service logs (for example, NSG) to be saved in the Blob Storage created in step 2.
6. Verify the logs with the filter "log_type contains azure_log".
7. For a detailed walkthrough, you can refer to this blog.
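If you prefer the Azure CLI over the Deploy to Azure button, the same ARM template can be deployed roughly as below. The resource group, template file name, and the API_URL/API_Token parameter names are assumptions based on the inputs the wizard asks for, so check the actual template exported from the Blob Storage log source before using this.

# Resource group name, location, template file and parameter names are hypothetical placeholders.
az group create --name vrlic-logs-rg --location westeurope

az deployment group create \
  --resource-group vrlic-logs-rg \
  --template-file azuredeploy.json \
  --parameters API_URL="<ingestion-api-url>" API_Token="<your-api-key>"
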
Cloud Native Application Logs in Kubernetes (DevOps Container Management): deploy Fluentd as a DaemonSet using a Helm chart to collect logs from the Kubernetes cluster.
1. Add the chart repo:
helm repo add loginsight-cloud https://munishpalmakhija.github.io/loginsight-cloud-helm/
2. Get the values file into your working directory:
helm show values loginsight-cloud/loginsight-cloud-helm > values.yaml
3. Update the values file with the API token and other relevant details.
4. Install the chart:
helm install <nameofrelease> loginsight-cloud/loginsight-cloud-helm -f values.yaml
5. Verify the Helm release (a kubectl check is sketched below):
helm list
6. Verify the logs with the filter "log_type contains kubernetes".
7. For a detailed walkthrough, you can refer to this blog.
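Besides helm list, you can confirm that the DaemonSet actually has a pod running on every node; the grep pattern below assumes the release or chart name contains "fluentd", so adjust it to whatever you named the release.

# The "fluentd" pattern is an assumption based on the chart's purpose; adjust to your release name.
kubectl get daemonset --all-namespaces | grep -i fluentd
kubectl get pods --all-namespaces -o wide | grep -i fluentd
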
Jenkins Application Logs (DevOps CI/CD Tool): deploy Fluentd and collect logs from a Jenkins application on Ubuntu.
1. Install Fluentd using the deb package (td-agent):
curl -L https://toolbelt.treasuredata.com/sh/install-ubuntu-trusty-td-agent3.sh | sh
2. Install the HTTP output plugin:
sudo /usr/sbin/td-agent-gem install fluent-plugin-out-http-ext
3. Verify that td-agent is running:
sudo /etc/init.d/td-agent status
4. Configure the td-agent.conf file located in the /etc/td-agent folder: update the input, filter, and output configuration (a minimal sketch follows below). You can refer to the Jenkins-specific details here.
5. Restart td-agent:
sudo /etc/init.d/td-agent restart
6. Verify the logs with the filter "log_type contains jenkins".
7. For a detailed walkthrough of a similar procedure for Apache, you can refer to this blog.
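For step 4, a minimal td-agent.conf sketch is shown below: it tails the Jenkins log and posts each line through the fluent-plugin-out-http-ext plugin installed above. The plugin parameter names (endpoint_url, serializer, custom_headers) and the ingestion URL reflect my reading of the plugin and API documentation and should be treated as assumptions; the linked walkthrough has the authoritative configuration.

# A sketch, not a drop-in config; <your-api-key> is a placeholder and the endpoint URL is an assumption.
cat <<'EOF' | sudo tee /etc/td-agent/td-agent.conf
<source>
  @type tail
  path /var/log/jenkins/jenkins.log
  pos_file /var/log/td-agent/jenkins.log.pos
  tag jenkins
  <parse>
    @type none
  </parse>
</source>

<match jenkins>
  @type http_ext
  endpoint_url   https://data.mgmt.cloud.vmware.com/le-mans/v1/streams/ingestion-pipeline-stream
  serializer     json
  custom_headers {"Authorization":"Bearer <your-api-key>"}
</match>
EOF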