Integration with data analysis tools (BI)

Learn how to integrate Dastra into your favorite data analysis tool!

To use this feature, your subscription must include the Enterprise plan.

Dastra makes it easy to import your report data into your favorite BI tool (Power BI, Google Looker, Tableau Software, etc.). The advantage of this feature is that you can use the full power of BI tools to analyze your data in depth, while your reports refresh automatically.

Setting up in Dastra

Go to the custom reports module, open one of your reports and click the “Integrate with a BI tool” button.

A modal window opens; click “Create integration link”.

Copy the generated link.

Please note! Do not share the generated link with anyone: it gives access to the raw data in the report.
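
Before connecting a BI tool, you may want to check that the link works and see what it returns. The sketch below uses Python with the requests library; the URL is only a placeholder for the integration link you copied, and the exact shape of the returned JSON depends on your report.

```python
# Minimal sketch: verify that the Dastra integration link returns JSON
# before wiring it into a BI tool. The URL is a placeholder for the link
# you copied from the report (keep it secret).
import requests

INTEGRATION_URL = "https://<your-dastra-integration-link>"  # placeholder

response = requests.get(INTEGRATION_URL, timeout=30)
response.raise_for_status()     # fail early on 401/403/500 errors
data = response.json()          # raises an error if the body is not JSON

# Inspect the top-level structure (its exact shape depends on your report)
if isinstance(data, list):
    print(f"{len(data)} rows, first row keys: {list(data[0]) if data else []}")
else:
    print(f"Top-level keys: {list(data)}")
```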

Power BI

  1. Open Power BI Desktop:

    • Launch Power BI Desktop on your machine.

  2. Connect to the Web data source:

    • In Power BI Desktop, go to the Home tab.

    • Click on Get Data, then select Web.

  3. Enter the URL of the JSON link:

    • In the Get Data Web window that appears, enter the URL of your JSON file or API (if you have a JSON share link).

    • Make sure the URL points to a JSON file or API that returns JSON data.

    • Click on OK.

  4. Authentication (if required):

    • If the URL requires authentication (for example, if it is protected by a password or API key), Power BI will ask you to log in.

    • Select the appropriate authentication type (e.g. Anonymous, OAuth2, API Key, etc.) and enter your credentials if necessary.

  5. Access JSON data:

    • Power BI will connect to the URL and retrieve the JSON file.

    • Once successfully connected, a JSON data preview window will appear. You can explore the data in the form of nested lists, objects or tables.

  • Performance: If your JSON file is large or contains deep hierarchies, Power BI may take some time to load and transform the data; a sketch for pre-flattening the file is shown after these notes.

  • Data refresh: If JSON data is regularly updated (e.g. by an API), you can set up a data refresh in Power BI Service to retrieve the latest data automatically.
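
If you prefer to pre-flatten the report data outside Power BI, a small script can turn the nested JSON into a flat table first. The sketch below uses Python with pandas and assumes the integration link returns a JSON array of row objects; the URL and output file name are placeholders.

```python
# Sketch: pre-flatten deeply nested report JSON into a flat table that
# Power BI (Get Data > Text/CSV) can load quickly. Assumes the integration
# link returns a JSON array of row objects -- adjust to your report's shape.
import pandas as pd
import requests

INTEGRATION_URL = "https://<your-dastra-integration-link>"  # placeholder

rows = requests.get(INTEGRATION_URL, timeout=30).json()

# json_normalize expands nested objects into dotted columns (e.g. "owner.name")
df = pd.json_normalize(rows, sep=".")
df.to_csv("dastra_report_flat.csv", index=False)  # placeholder output file
print(df.head())
```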

Google Looker

Google Looker connects easily to Google BigQuery, a Google Cloud data warehouse that can store and process JSON files.

1. Using Google BigQuery as a data source

  1. Upload the JSON file to Google Cloud Storage:

    • If your JSON file is accessible via a share link, start by uploading the file to Google Cloud Storage.

    • Make it public or set the appropriate permissions.

  2. Load JSON data into BigQuery:

    • Open the Google Cloud Console.

    • Go to BigQuery and select or create a project.

    • In BigQuery, select your dataset and click on Create Table.

    • Select Google Cloud Storage as the source and enter the URL of your JSON file in the URL field.

    • Set the data format to JSON (BigQuery load jobs expect newline-delimited JSON) and make sure BigQuery correctly interprets the structure of your file.

  3. Connect BigQuery to Looker:

    • Once the JSON file has been loaded into BigQuery, you can connect Looker to BigQuery.

    • In Looker, go to Admin > Connections and add BigQuery as a data source.

    • Once the connection has been established, you can query BigQuery data directly in Looker and visualize it.
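
The same load can also be scripted with the google-cloud-bigquery client library, which is convenient if you want to automate the import. A minimal sketch, assuming the file in Cloud Storage is newline-delimited JSON and using placeholder project, dataset, table and bucket names:

```python
# Sketch: load a newline-delimited JSON file from Cloud Storage into BigQuery
# with the google-cloud-bigquery client. Project, dataset, table and bucket
# names are placeholders; authenticate with gcloud or a service account first.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")         # placeholder project

table_id = "my-project.dastra_reports.custom_report"   # placeholder table
uri = "gs://my-bucket/dastra_report.jsonl"             # placeholder GCS path

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # wait for the job to finish

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```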

2. Using ETL tools to integrate JSON into Looker

If you need to automate the integration of JSON data from an API or external file into Looker, you can use ETL (Extract, Transform, Load) tools such as Fivetran, Stitch or Airflow. These tools can extract JSON data, transform it and load it into a Looker-compatible data warehouse.

Example with Fivetran:

  1. Connect a REST API or JSON source to Fivetran.

  2. Transform the data to match the requirements of your data warehouse.

  3. Load the data into the warehouse (e.g., BigQuery).

  4. Connect Looker to this data warehouse to start querying and visualizing the data.
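
For lighter-weight needs, the same extract/transform/load flow can be hand-rolled in a short script instead of a managed ETL tool. The sketch below (a do-it-yourself alternative, not Fivetran itself) pulls the report JSON from the integration link, flattens it and loads it into BigQuery, where Looker can query it; the URL and table name are placeholders.

```python
# Sketch: a hand-rolled extract/transform/load flow that pulls the report
# JSON and pushes it to BigQuery, where Looker can query it.
import pandas as pd
import requests
from google.cloud import bigquery

INTEGRATION_URL = "https://<your-dastra-integration-link>"  # placeholder
TABLE_ID = "my-project.dastra_reports.custom_report"        # placeholder

# Extract: fetch the raw report data from the Dastra integration link
rows = requests.get(INTEGRATION_URL, timeout=30).json()

# Transform: flatten nested objects into plain columns
# (clean up null/NaN values here if your report contains empty cells)
records = pd.json_normalize(rows, sep="_").to_dict(orient="records")

# Load: send the rows to BigQuery as a load job
client = bigquery.Client()
job = client.load_table_from_json(
    records,
    TABLE_ID,
    job_config=bigquery.LoadJobConfig(autodetect=True),
)
job.result()
print(f"Loaded {len(records)} rows into {TABLE_ID}")
```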

Tableau Software

1. Open Tableau Desktop:

  • Launch Tableau Desktop on your machine.

2. Connect to a Web data source:

  • In Tableau Desktop, go to the Data menu and select New Data Source (or use the Connect pane on the start page).

  • In the data source selection window, click on Web Data Connector. This option lets you connect to data from various web sources, including JSON files via an API or URL.

3. Enter JSON URL:

  • You'll need to enter the URL of the JSON file or API in the window that appears.

4. Authentication (if required):

  • If the JSON URL requires some form of authentication (e.g. an API key, OAuth token, or login credentials), Tableau will ask you for credentials to access it.

  • Select the appropriate authentication type (e.g. API Key or OAuth) and enter the necessary information.

5. Analyze JSON data in Tableau:

  • Tableau will connect to the URL, download the JSON file and attempt to convert it into an understandable tabular format (a table or structured data view).

  • Tableau can automatically detect the structure of JSON data, but in some cases, if the file is very complex or nested, you may need to make adjustments in the data editor to flatten or transform the data into an appropriate format.

  • JSON file limitations: If your JSON file is large or highly nested, Tableau may have difficulty parsing it correctly. It may be necessary to transform or simplify the file before integrating it (see the sketch after these notes).

  • Authentication and security: If your JSON link requires an API key, be sure to manage authorizations and credentials. Some APIs may have limitations on the number of requests or require specific authorizations.

  • Data refresh: If the JSON data changes regularly (for example, via an API that provides real-time updates), you can configure Tableau to refresh the data automatically at regular intervals to keep your reports up to date.
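
One practical workaround for large or deeply nested files is to pre-process the JSON into a flat CSV that Tableau can read through its Text file connector, and to re-run the script on a schedule to keep the data fresh. A minimal sketch, assuming the integration link returns a JSON array of flat row objects; the URL and output file name are placeholders.

```python
# Sketch: pre-process the report JSON into a flat CSV that Tableau's
# "Text file" connector can read. Re-run it on a schedule (cron, Task
# Scheduler, etc.) to keep the extract fresh.
import csv
import requests

INTEGRATION_URL = "https://<your-dastra-integration-link>"  # placeholder
OUTPUT_FILE = "dastra_report.csv"                           # placeholder

rows = requests.get(INTEGRATION_URL, timeout=30).json()

# Collect every key seen across rows so the CSV header is complete
fieldnames = sorted({key for row in rows for key in row})

with open(OUTPUT_FILE, "w", newline="", encoding="utf-8") as handle:
    writer = csv.DictWriter(handle, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)  # missing keys are written as empty cells

print(f"Wrote {len(rows)} rows to {OUTPUT_FILE}")
```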
