{"id":2792,"date":"2019-10-23T12:50:30","date_gmt":"2019-10-23T04:50:30","guid":{"rendered":"https:\/\/www.mondoze.com\/guide\/?post_type=kb&p=2792"},"modified":"2022-10-05T08:01:38","modified_gmt":"2022-10-05T00:01:38","slug":"using-google-cloud-platform-to-analyze-cloudflare-logs","status":"publish","type":"kb","link":"https:\/\/www.mondoze.com\/guide\/kb\/using-google-cloud-platform-to-analyze-cloudflare-logs","title":{"rendered":"Using Google Cloud Platform to Analyze Cloudflare Logs"},"content":{"rendered":"\t\t
This tutorial covers how to configure certain Google Cloud Platform (GCP) components so that you can analyze your Cloudflare Logs data.
Before proceeding, you need to enable Cloudflare Logpush to Google Cloud Storage to ensure your log data is available for analysis.
The components we'll use in this tutorial include:

- Google Cloud Storage
- Google Cloud Function
- Google BigQuery
- Google Data Studio
The following diagram depicts how data flows from Cloudflare Logs through the different components of the Google Cloud Platform discussed in this tutorial.
Note: Google Cloud offers a credit towards a new Google Cloud account to help you get started. To learn more, visit Google Cloud Platform Partner Credit.
Task 1 – Use Google Cloud Function to import log data into Google BigQuery

After you configure Cloudflare Logpush to send your logs to a Google Cloud Storage bucket, your log data updates every five minutes by default.

Google BigQuery makes data available both for querying using Structured Query Language (SQL) and for configuring as a data source for the Google Data Studio reporting engine. BigQuery is a highly scalable cloud database where SQL queries run quite fast.

Importing data from Google Cloud Storage into Google BigQuery requires creating a function using Google Cloud Function and running it in the Google Cloud Shell. This function triggers every time new Cloudflare log data is uploaded to your Google Cloud Storage bucket.
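As a concrete preview, once log data has been imported you can query it straight from Cloud Shell with the bq command-line tool. This is a minimal sketch, assuming the DATASET and TABLE names used in the deploy.sh example below; ClientRequestHost is a standard Cloudflare Logpush field, but it appears in your table only if your Logpush job includes it:

    # Top hostnames by request count across the imported Cloudflare logs
    bq query --use_legacy_sql=false \
    'SELECT ClientRequestHost, COUNT(*) AS requests
     FROM `my_cloudflare_logs.cloudflare_logs`
     GROUP BY ClientRequestHost
     ORDER BY requests DESC
     LIMIT 10'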
Clone and deploy a Google Cloud Function

To create a cloud function to import data from Google Cloud Storage into Google BigQuery, you will need the following GitHub repository from Cloudflare: https://github.com/cloudflare/GCS-To-Big-Query.

To clone and deploy the cloud function:

1. Run the Google Cloud Platform shell by opening the Google Cloud Platform console and clicking the Google Shell icon (Activate Cloud Shell).

2. Run the following command to download the master zipped archive, uncompress the files to a new directory, and change the command line prompt to the new directory:

    curl -LO "https://github.com/cloudflare/GCS-To-Big-Query/archive/master.zip" && unzip master.zip && cd GCS-To-Big-Query-master

3. Next, edit the deploy.sh file and make sure that:

   a. BUCKET_NAME is set to the bucket you created when you configured Cloudflare Logpush with Google Cloud Platform.

   b. DATASET and TABLE are unique names.

   The contents of deploy.sh should look similar to this:

    . . .
    BUCKET_NAME="my_cloudflarelogs_gcp_storage_bucket"
    DATASET="my_cloudflare_logs"
    TABLE="cloudflare_logs"
    . . .

4. Then, in the Google Shell, run the following command to deploy your instance of the cloud function:

    sh ./deploy.sh
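Before deploying, it can be worth confirming that the bucket named in BUCKET_NAME is actually receiving Logpush files. A quick check from Cloud Shell, using the illustrative bucket name from the deploy.sh example above (substitute your own):

    # List recent Logpush objects in the bucket referenced by BUCKET_NAME.
    # Object names are date-prefixed, so the lexicographic listing is roughly chronological.
    gsutil ls -l gs://my_cloudflarelogs_gcp_storage_bucket/ | tail -5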
Once you've deployed your new cloud function, verify that it appears in the Cloud Functions interface by navigating to Google Cloud Platform > Compute > Cloud Functions.
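You can also verify the deployment from the command line. A small sketch, assuming the gcloud CLI available in Cloud Shell; replace <FUNCTION_NAME> with the name the deploy script assigned, which you can read off the list output:

    # Confirm the function deployed and shows a status of ACTIVE
    gcloud functions list

    # After the next Logpush upload, inspect recent executions for errors
    gcloud functions logs read <FUNCTION_NAME> --limit 20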