[GA4] About Data Import

Upload data from external sources and join it with your Analytics data

Why use Data Import?

Each business system you use generates its own data. Your CRM might contain information like customer-loyalty ratings, lifetime values, and product preferences. If you're a web publisher, your content-management system might store dimensions like author and article category. If you run an ecommerce business, you store item attributes like price, style, and size.

And you use Analytics to measure traffic and performance for your websites and apps.

Typically, each body of data exists in its own silo, uninformed by the other data. Data Import lets you join all of this data in Analytics on a defined schedule, breaking down those silos, unlocking new insights, and democratizing your data.

How Data Import works

Uploading data

You upload CSV files that contain external data to an Analytics property. You can export those CSV files from an offline business tool like your CRM or CMS system, or for smaller amounts of data, you can create the files manually in a text editor or spreadsheet.

Data Import joins the offline data you upload with the event data that Analytics collects. The imported data enhances your reports, comparisons, and audiences. The result is a more complete picture of online and offline activity.
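For smaller amounts of data, a CSV can be generated with a few lines of code. The sketch below writes a small user-data file; the column names (user_id, loyalty_rating, lifetime_value) are illustrative examples, not a schema required by Analytics.

```python
import csv

# Illustrative rows of offline user metadata; the field names are
# example placeholders, not an Analytics-mandated schema.
rows = [
    {"user_id": "u-1001", "loyalty_rating": "gold", "lifetime_value": "1520.50"},
    {"user_id": "u-1002", "loyalty_rating": "silver", "lifetime_value": "310.00"},
]

with open("user_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["user_id", "loyalty_rating", "lifetime_value"])
    writer.writeheader()
    writer.writerows(rows)
```

The resulting file has one header row plus one row per user, ready to upload as a data source.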

Joining data

Data is joined in one of two ways, depending on the type of imported data:

  • Collection/Processing time: Your imported data is joined with Analytics data as the Analytics data is collected and processed, as if it had been collected with the event, and the joined data is written to the Analytics aggregate tables. The imported data is not joined with Analytics historical data (data that has already been processed). If you delete the imported data file, no further joins take place, but the joins that have already taken place remain.

    User data and offline-event data are joined at collection/processing time.
  • Reporting/Query time: Your imported data is joined with Analytics data when you open a report and Analytics issues a query for the report data. This type of join is temporary: if you delete the imported data file, no further joins take place and the joined data will no longer be accessible in Analytics.

    Cost, item, and custom event data are joined at reporting/query time.

    Reporting/Query time data is not available when you're creating audiences in Analytics or when you're creating segments in Explorations.

When you import data, previously imported data persists and the new data is appended. However, if the new data has the same set of keys as previously imported data, the earlier data is overwritten.
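This append-and-overwrite behavior can be modeled with a dictionary keyed on the import keys — a simplified sketch of the semantics, not of how Analytics actually stores data:

```python
# Simplified model of import merge semantics: rows are keyed by their
# import key(s); re-importing an existing key overwrites the earlier
# row, while rows with new keys are appended alongside existing data.
def merge_import(existing: dict, new_rows: dict) -> dict:
    merged = dict(existing)   # previously imported data persists
    merged.update(new_rows)   # same keys overwrite; new keys append
    return merged

first = {"SKU-1": {"color": "red"}, "SKU-2": {"color": "blue"}}
second = {"SKU-2": {"color": "navy"}, "SKU-3": {"color": "green"}}
result = merge_import(first, second)
# SKU-1 persists, SKU-2 is overwritten, SKU-3 is appended
```

Here SKU-2's color ends up "navy" because the second import shares that key with the first.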

Types of metadata you can import

Metadata

Importing metadata adds to the data already collected and processed by a property. Typically, metadata is stored in a custom dimension or metric, though in some cases you might want to overwrite the default information already gathered (for example, importing a product catalog with updated categories).

You can import the following data types:

  • Cost data: Third-party (non-Google) ad-network click, cost, and impression data
  • Item data: Product metadata like size, color, style, or other product-related dimensions
  • User data: User metadata, for example, a loyalty rating or lifetime customer value, which you can use to create segments and remarketing lists
  • Offline events: Offline events from sources that don't have an internet connection or that otherwise don’t support real-time event collection
  • Custom event data: Import event metadata via standard fields and/or custom dimensions

Limits

Data-source size: 1 GB

Daily uploads: 120 uploads per property per day

Import data type    Data-source limit per property    Storage limit per data type
Cost data           Up to 5                           1 GB across all import sources
Item data           Up to 5                           1 GB across all import sources
User data           Up to 10                          Not applicable
Offline events      Up to 10                          Not applicable
Custom event data   Up to 5                           1 GB across all import sources

You can find the current quota usage in-product through the "Quota information" button.
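Before uploading, you can sanity-check a file against the 1 GB data-source size limit. A rough sketch, assuming the limit is decimal gigabytes (whether Analytics counts decimal GB or GiB is an assumption here):

```python
import os

GB = 1_000_000_000  # assumption: 1 GB limit measured in decimal bytes

def within_size_limit(path: str, limit_bytes: int = GB) -> bool:
    """Return True if the file fits under the data-source size limit."""
    return os.path.getsize(path) <= limit_bytes

# Demo with a tiny CSV
with open("small.csv", "w") as f:
    f.write("a,b\n1,2\n")

print(within_size_limit("small.csv"))  # True
```

A check like this is cheap to run in an export pipeline before each scheduled upload.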

How to import data

When you import data, you create a data source. A data source is the combination of the CSV file you want to upload and a mapping of existing Analytics fields to the fields in your CSV.

Do not upload a file that includes duplicate keys (for example, two fields named user_id).

To learn more, refer to the article about data sources.
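A quick pre-upload check for duplicate header fields can be done with the standard library; a sketch:

```python
import csv
from collections import Counter

def duplicate_keys(csv_path: str) -> list:
    """Return header fields that appear more than once in the CSV."""
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f))
    return [name for name, count in Counter(header).items() if count > 1]

# Example: a file with two user_id columns, which should be rejected
with open("bad.csv", "w", newline="") as f:
    f.write("user_id,loyalty_rating,user_id\nu-1,gold,u-1\n")

print(duplicate_keys("bad.csv"))  # ['user_id']
```

If the returned list is non-empty, fix the export before creating the data source.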

Prerequisites for using SFTP to upload data

If you plan to use the SFTP option in Step 5, make sure your SFTP server supports the ssh-rsa and ssh-dss host-key algorithms. Learn more about verifying which host-key algorithms you use and how to format the SFTP-server URL.

Start the import process

  1. In Admin, under Data collection and modification, click Data import.
    Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector. You must be an Editor or above at the property level to successfully start the import process.
  2. Create a new data source or select an existing data source. (Check the following sections.)

Create a new data source

  1. Click Create data source.
  2. Enter a name for your data source.
  3. Select the data type:
    • Cost data (query-time import only)
    • Item data (reporting/query-time import)
    • User data by User ID (collection/processing-time import)
    • User data by Client ID (collection/processing-time import)
    • Offline events data (collection/processing-time import)
    • Custom event data (reporting/query-time import)
  4. Click Review terms if prompted. This prompt is displayed if you are importing device or user data.
  5. Do one of the following:
    • Select Manual CSV upload, select the CSV file on your computer, then click Open.
    • Select SFTP, then provide the following:
      • SFTP server user name: Enter your user name for your SFTP server.
      • SFTP server URL: Enter the URL for your SFTP server.
      • Frequency: Choose the upload frequency (Daily, Weekly, Monthly).
      • Start time: Select the hour when you want the upload to start.
      After the data source is created, the public key for your SFTP server is visible in the interface where you're creating the data source, and it remains available in the data-source details (refer to the section below).
  6. Click Next to proceed to the mapping stage.
  7. Select the Analytics fields and imported fields you want to map to one another. Edit the field names as necessary.
  8. Click Import.

Upload data to an existing data source

  1. In the row for an existing data source, click Import now.
  2. If the data source is configured for CSV import, then select the CSV file you want to import and click Open.

The CSV file has to include the same fields as the original, or a subset of them. If you want to import different fields for the same data type, you need to delete the existing data source and create a new one.

Data imported into the source property is automatically exported to both the roll-up property and its subproperties.

Verify SFTP host-key algorithms; format SFTP-server URL

Verify algorithms

There are different methods you can use to verify whether your SFTP server uses either the ssh-rsa or ssh-dss host-key algorithm. For example, you can use the OpenSSH remote-login client to check your server logs via the following command:

ssh -vv <your sftp server name>

If your server supports either of those algorithms, you should see a line like the following in the output:

debug2: host key algorithms: rsa-sha2-512, rsa-sha2-256, ssh-rsa
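You could also scan the ssh -vv output programmatically. A sketch that looks for either required algorithm in the debug line shown above (the parsing is a best-effort assumption about the log format, which can vary between OpenSSH versions):

```python
def supports_required_algorithms(ssh_debug_output: str) -> bool:
    """Return True if ssh -vv output advertises ssh-rsa or ssh-dss."""
    for line in ssh_debug_output.splitlines():
        if "host key algorithms:" in line:
            advertised = [a.strip()
                          for a in line.split("host key algorithms:")[1].split(",")]
            return "ssh-rsa" in advertised or "ssh-dss" in advertised
    return False

log = "debug2: host key algorithms: rsa-sha2-512, rsa-sha2-256, ssh-rsa"
print(supports_required_algorithms(log))  # True
```

If this returns False for your server, the SFTP option in Analytics will not be able to connect.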

Format SFTP-server URL

If your SFTP-server URL is badly formatted, your import setup will fail with an internal-error message.

An SFTP-server URL generally has three parts that you need to consider when uploading data-import files. For example:

sftp://example.com//home/jon/upload.csv has the following parts:

  • Domain: example.com
  • Home directory: //home/jon
  • File path: /upload.csv

In the example above, the upload file is located in the home directory.

You can format the domain portion of the URL in a variety of ways, using the domain name or the IPv4 or IPv6 address of the server, with or without a port number:

  • Domain name: sftp://example.com
  • IPv4 (with port number): sftp://142.250.189.4:1234
  • IPv4 (without port number): sftp://142.250.189.4
  • IPv6 (with port number): sftp://[2607:f8b0:4007:817::2004]:1234
  • IPv6 (without port number): sftp://[2607:f8b0:4007:817::2004]

If you do not include the port number, then the default port is 22.
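The domain, port, and path parts described above can be pulled out with a standard URL parser; a sketch (Python's urlparse handles the sftp scheme generically, including bracketed IPv6 hosts):

```python
from urllib.parse import urlparse

def sftp_url_parts(url: str):
    """Split an SFTP URL into host, port (22 if omitted), and path."""
    parsed = urlparse(url)
    return parsed.hostname, parsed.port or 22, parsed.path

print(sftp_url_parts("sftp://example.com//home/jon/upload.csv"))
# ('example.com', 22, '//home/jon/upload.csv')
print(sftp_url_parts("sftp://[2607:f8b0:4007:817::2004]:1234/upload.csv"))
# ('2607:f8b0:4007:817::2004', 1234, '/upload.csv')
```

Note that the double slash before the home directory is preserved in the path, which is what Analytics expects.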

A correctly formatted URL can either include or exclude the home directory. The following examples of correctly formatted URLs use different formats to identify the domain. The examples include port numbers, but the port number is optional.

  • Include home directory:
    • sftp://example.com//home/jon/upload.csv (domain name)
    • sftp://142.250.189.4:1234//home/jon/upload.csv (IPv4 with port number)
  • Exclude home directory:
    • sftp://example.com/upload.csv (domain name)
    • sftp://[2607:f8b0:4007:817::2004]:1234/upload.csv (IPv6 with port number)

If your upload file is located in a subdirectory of your home directory, your URL would look something like:

sftp://example.com//home/jon/data/upload.csv

In this case, you can use the following types of formats:

  • Include home directory:
    • sftp://example.com//home/jon/data/upload.csv
    • sftp://142.250.189.4:1234//home/jon/data/upload.csv (IPv4 with port number)
  • Exclude home directory:
    • sftp://example.com/data/upload.csv
    • sftp://[2607:f8b0:4007:817::2004]:1234/data/upload.csv (IPv6 with port number)

If your upload file is not stored in your home directory (//home/jon) or a subdirectory of your home directory (//home/jon/data), and is instead stored in the directory /foo/bar, then the properly formatted URL for your upload file would look something like:

sftp://example.com//foo/bar/upload.csv (//foo/bar replaces the home directory)

View data-source details, get your SFTP public key, import new data, delete a data source

  1. In Admin, under Data collection and modification, click Data import.
    Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector.

    You must be a Viewer or above at the property level to view data source details.

  2. In the row for the data source, click More, then select the option to view the data-source details.

You can view the name, data type, public key, and history of each upload.

  • Public key: The public key that corresponds to a matching private key that Analytics stores (and never shares). The key pair ensures a secure, private connection between your server and the Analytics data-import servers. You must authorize this public key on your server so that data import can function safely and securely.
  • % imported: The number of rows successfully imported divided by the number of rows in the import file. 100% means all rows were successfully imported.
  • Match rate: The percentage of keys in the import file that can be found in your property within the last 90 days. 100% means every key matched data in your property from the last 90 days.
    Note: % imported and match rate are relevant to cost, item, and custom event data import, but are not applicable to user data import or offline event data import.
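Both metrics are simple ratios; a worked sketch with made-up numbers:

```python
def percent_imported(rows_imported: int, rows_in_file: int) -> float:
    """Share of file rows that were successfully imported."""
    return 100 * rows_imported / rows_in_file

def match_rate(keys_matched: int, keys_in_file: int) -> float:
    """Share of file keys also seen in the property in the last 90 days."""
    return 100 * keys_matched / keys_in_file

# Hypothetical upload: 1,000 rows, 950 imported, 800 keys matched
print(percent_imported(950, 1000))  # 95.0
print(match_rate(800, 1000))        # 80.0
```

A low match rate usually means the keys in your export (for example, user IDs) don't line up with the values Analytics has collected.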

To import new data:

Click Import now and choose the relevant CSV file on your computer.

To delete the data source:

  1. Click More > Delete data source.
  2. Read the deletion notice, then click Delete data source.

You can delete Collection/Processing-time data, but if you want to remove previously uploaded data from all events processed by Analytics, you may also need to follow up with a user or user-property deletion. Deleting an already-imported file does not remove the processed data that has been associated with events collected since the import was completed. To learn more, refer to the article about data-deletion requests.

Reserved names and prefixes

The following event names, event-parameter names, user-property names, and prefixes are reserved for use by Analytics. If you try to upload data that includes any of the reserved names or prefixes, that data will not be uploaded.

For example:

  • If you try to import an event with a reserved name, that event and its parameters will not be imported.
  • If you try to import an event with a valid name but one of the parameters uses a reserved name, the event will be imported but the parameter with the reserved name will not be imported.

Reserved event names

  • ad_activeview
  • ad_exposure
  • ad_impression
  • ad_query
  • adunit_exposure
  • app_clear_data
  • app_install
  • app_remove
  • app_update
  • error
  • first_open
  • first_visit
  • in_app_purchase
  • notification_dismiss
  • notification_foreground
  • notification_open
  • notification_receive
  • os_update
  • screen_view
  • session_start
  • user_engagement

Reserved event-parameter names

  • firebase_conversion

Reserved user property names

  • first_open_after_install
  • first_open_time
  • first_visit_time
  • last_deep_link_referrer
  • user_id

Reserved prefixes (applies to event parameters and user properties)

  • ga_
  • google_
  • firebase_
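A pre-upload filter can screen names against these lists before you build the CSV; a sketch using the reserved names and prefixes above (the validation helpers themselves are hypothetical, not an Analytics API):

```python
# Reserved names and prefixes from the lists above
RESERVED_EVENT_NAMES = {
    "ad_activeview", "ad_exposure", "ad_impression", "ad_query",
    "adunit_exposure", "app_clear_data", "app_install", "app_remove",
    "app_update", "error", "first_open", "first_visit", "in_app_purchase",
    "notification_dismiss", "notification_foreground", "notification_open",
    "notification_receive", "os_update", "screen_view", "session_start",
    "user_engagement",
}
RESERVED_PARAM_NAMES = {"firebase_conversion"}
RESERVED_PREFIXES = ("ga_", "google_", "firebase_")

def is_importable_event(name: str) -> bool:
    """True if the event name is not reserved."""
    return name not in RESERVED_EVENT_NAMES

def is_importable_parameter(name: str) -> bool:
    """True if the parameter name is not reserved and has no reserved prefix."""
    return name not in RESERVED_PARAM_NAMES and not name.startswith(RESERVED_PREFIXES)

print(is_importable_event("screen_view"))       # False
print(is_importable_parameter("ga_source"))     # False
print(is_importable_parameter("loyalty_tier"))  # True
```

Filtering up front avoids silently losing events or parameters during the upload.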
