Supported editions for this feature: Frontline Standard; Enterprise Standard and Enterprise Plus; Education Standard and Education Plus; Enterprise Essentials Plus.
To export activity log events (single actions taken by a user) and usage reports (aggregate reports for an app) to Google BigQuery, you need to set up a BigQuery Export configuration in the Google Admin console.
About BigQuery and Reports API data
The data available in the BigQuery dataset differs from the data retrieved from the Reports API. The BigQuery data includes only the unfiltered dataset. You can still filter the data using SQL, but not all Reports API parameters are supported.
You can filter the Reports API data by including parameters in the API request.
Example: A domain has two organizational units (OUs), A and B. With both the Reports API and BigQuery, you can access all events for the entire domain (A and B).
- With the Reports API, you can retrieve only the events for OU A by including the orgUnitID parameter in the API request (a sketch follows this list).
- With SQL in BigQuery, you can't filter events by organizational unit because there is no column corresponding to the orgUnitID parameter.
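As a minimal sketch of the Reports API side of this example, using the Python client library: the application name and OU ID below are placeholders, and the code assumes Application Default Credentials for an administrator with the Reports API read-only scope.

```python
from googleapiclient.discovery import build

# Assumes Application Default Credentials for an admin account with the
# https://www.googleapis.com/auth/admin.reports.audit.readonly scope.
reports = build("admin", "reports_v1")

# The Reports API request itself can be scoped to one organizational unit (OU A).
response = reports.activities().list(
    userKey="all",
    applicationName="drive",          # placeholder application
    orgUnitID="id:03ph8a2z1example",  # placeholder ID for OU A
    maxResults=10,
).execute()

for activity in response.get("items", []):
    print(activity["actor"].get("email"), activity["events"][0]["name"])
```

In BigQuery, by contrast, there is no orgUnitID column, so an equivalent WHERE clause isn't possible; you can only filter on the columns the export schema provides (see the query sketch in the next section).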
Important:
- The BigQuery data includes historical data, which you can also retrieve from the Reports API.
- If you turn off exporting Google Workspace data to BigQuery, no new data is included in the BigQuery Export. However, existing data is available in other sources, such as the Reports API.
- Not all service report data is available in BigQuery Export. For a list of supported services, go to What services does BigQuery Export support? later.
- For examples of queries, go to Example queries for reporting logs in BigQuery.
How data is propagated and retained
- Policies can take an hour to propagate. After that, daily tables are created in your dataset (Pacific Time). For a sample query against these tables, see the sketch after this list.
- Data is saved following guidelines for other logs and reports. For details, go to Data retention and lag times.
- Data tables don’t automatically get deleted. To delete an active project, go to Delete a BigQuery Export configuration.
- BigQuery Export collects Google Workspace data from the previous day's events, so the exported data covers the previous day up to the export date.
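As a minimal sketch of reading one day of exported data back, assuming placeholder project, dataset, and table names (my-export-project, workspace_logs, activity) and daily partitioning; substitute the names from your own configuration, and go to Example queries for reporting logs in BigQuery for authoritative examples.

```python
from google.cloud import bigquery

# Placeholder project ID; the dataset and table names below are placeholders too.
client = bigquery.Client(project="my-export-project")

# Restrict the scan to a single day's data in the activity table.
query = """
    SELECT
      email,
      event_type,
      event_name,
      TIMESTAMP_MICROS(time_usec) AS event_time
    FROM `my-export-project.workspace_logs.activity`
    WHERE DATE(_PARTITIONTIME) = "2024-06-01"
    LIMIT 100
"""

for row in client.query(query).result():
    print(row.email, row.event_name, row.event_time)
```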
Set up a BigQuery Export configuration
You first need to set up a BigQuery project in the Google Cloud console. When you create the project, do the following:
- Add a Google Workspace administrator account as the project editor.
- Add the gapps-reports@system.gserviceaccount.com account as an editor. This account is needed to write logs, update the schema, and to complete Step 6 below. For one way to grant both roles programmatically, see the sketch that follows these notes.
For instructions, go to Set up a BigQuery project for reporting logs.
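If you'd rather grant those roles programmatically than in the Google Cloud console, one possible approach is sketched below using the Cloud Resource Manager API. The project ID and administrator address are placeholders, and the sketch assumes you have permission to change the project's IAM policy.

```python
from googleapiclient.discovery import build

# Assumes Application Default Credentials with permission to edit the
# project's IAM policy.
crm = build("cloudresourcemanager", "v1")
project_id = "my-export-project"  # placeholder project ID

policy = crm.projects().getIamPolicy(resource=project_id, body={}).execute()
policy.setdefault("bindings", []).append({
    "role": "roles/editor",
    "members": [
        "user:workspace-admin@example.com",                         # placeholder admin
        "serviceAccount:gapps-reports@system.gserviceaccount.com",
    ],
})
crm.projects().setIamPolicy(resource=project_id, body={"policy": policy}).execute()
```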
1. Sign in to your Google Admin console.
Sign in using an account with super administrator privileges (does not end in @gmail.com).
2. In the Admin console, go to Menu > Reporting > Data integrations.
Education administrators: go to Menu > Reporting > BigQuery export, which opens the Data integrations page.
3. Point to the BigQuery Export card and click Edit.
4. To activate BigQuery logs, check the Enable Google Workspace data export to Google BigQuery box.
5. (Optional) To export sensitive parameters of DLP rules, check the Allow export of sensitive content from DLP rule logs box. For details, go to View content that triggers DLP rules.
6. Under BigQuery project ID, select the project where you want to store the logs. Choose a project that you have write access to. If you don't see the project, you need to set it up in BigQuery. For details, go to Quickstart using the Google Cloud console.
7. Under New dataset within project, enter the name of the dataset to use for storing the logs in the project. Dataset names must be unique for each project. For details, go to Creating datasets.
8. Click Save.
Note: If you can't save the project, go to the Google Cloud console, delete the new dataset, and then save the configuration again in the Admin console.
When the export is triggered, the dataset is created the next day. In addition to the project owners, editors, and viewers, the gapps-reports@system.gserviceaccount.com service account is added as an editor. The service account is required to write logs and update the schema.
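To confirm that the service account has access to the new dataset, you can list the dataset's access entries. A minimal sketch with placeholder project and dataset names:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-export-project")  # placeholder project ID
dataset = client.get_dataset("workspace_logs")         # placeholder dataset name

# gapps-reports@system.gserviceaccount.com should appear with an editor
# (WRITER) role once the dataset has been created.
for entry in dataset.access_entries:
    print(entry.role, entry.entity_type, entry.entity_id)
```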
Log data export requirements
Log data is exported through the insertAll API, which requires billing to be enabled for your BigQuery export project. If billing isn't enabled, your project is in sandbox mode and log data isn't exported to your dataset. For more details, go to Limitations.
Note: Usage report exports are still enabled for sandbox mode projects.
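One way to check whether your export project is in sandbox mode is to look up its billing status with the Cloud Billing API. A minimal sketch, with a placeholder project ID:

```python
from googleapiclient.discovery import build

# Assumes Application Default Credentials with permission to view billing info.
billing = build("cloudbilling", "v1")

info = billing.projects().getBillingInfo(
    name="projects/my-export-project"  # placeholder project ID
).execute()

if info.get("billingEnabled"):
    print("Billing is enabled; log data can be exported to the dataset.")
else:
    print("Project is in sandbox mode; only usage reports are exported.")
```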
Lag times
In most cases, after you turn on data export to Google BigQuery, activity log events are available within 10 minutes. Usage log events have a delay of 48 hours on initial configuration, but afterwards the usual lag is 1-3 days. For details, go to Data retention and lag times.
FAQ
How do I set a data expiration for my exports?
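One possible approach is to set an expiration on the exported data in BigQuery itself. The sketch below applies a 90-day partition expiration to a hypothetical activity table (placeholder project, dataset, and retention period, and it assumes daily ingestion-time partitioning); whether table or partition expiration is the right control depends on how your data is organized.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-export-project")  # placeholder project ID
table = client.get_table("my-export-project.workspace_logs.activity")  # placeholder table

# Assumes daily partitioning; each day's partition expires 90 days after it is written.
table.time_partitioning = bigquery.TimePartitioning(
    type_="DAY",
    expiration_ms=90 * 24 * 60 * 60 * 1000,
)
client.update_table(table, ["time_partitioning"])
```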
Can I change a BigQuery project ID?
What services does BigQuery Export support?
Activity log events are available for these services:
- Accounts
- Admin
- Google Calendar
- Chrome
- Classroom
- DataStudio
- Devices
- Google Drive
- Gmail
- Google Chat
- Google Meet
- Google Meet Hardware
- Google Groups
- Login
- Rules
- SAML
- OAuth
Usage reports are available for these services:
- Accounts
- App Maker
- Google Apps Script
- Calendar
- Chrome OS
- Classroom
- Devices
- Google Docs
- Drive
- Gmail
- Google Search
- Meet
- Google Sites
- Google Voice
Note: We plan to support more log events, including Search.
Is there a cost to export log events to BigQuery?
Log events are exported through the insertAll API, which requires billing to be enabled, so standard BigQuery streaming charges can apply to your project. There is no cost to export usage reports, such as Devices or Meet reports.