Codat to Azure SQL Data Warehouse

This page provides you with instructions on how to extract data from Codat and load it into Azure SQL Data Warehouse. (If this manual process sounds onerous, check out Stitch, which can do all the heavy lifting for you in just a few clicks.)

What is Codat?

Codat provides a standardized data format across multiple accountancy software applications and financial APIs.

What is Azure SQL Data Warehouse?

Azure SQL Data Warehouse is a cloud-based petabyte-scale columnar database service with controls to manage compute and storage resources independently. It offers encryption of data at rest and dynamic data masking to mask sensitive data on the fly, and it integrates with Azure Active Directory. It can replicate to read-only databases in different geographic regions for load balancing and fault tolerance.

Getting data out of Codat

Codat exposes data through a REST API, which developers can call with GET requests to extract information. Codat provides data through its customers, suppliers, invoices, bills, payments, creditNotes, and bankStatements endpoints. You can append an optional query parameter in the format [propertyName][operator][value] to select only certain records. Operators include "equal to" (%3d) and, for numeric and date values, "greater than" (%3e) and "less than" (%3c). So, to retrieve invoices for a customer whose ID is "61," you would call:

GET /companies/[companyId]/data/invoices?query=customerRef.id%3d61
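For example, a minimal Python sketch of that call might look like the following. It assumes the standard Codat base URL and an API key passed over HTTP Basic auth; the key, company ID, and authentication details are placeholders, so check Codat's API documentation for what applies to your account.

import requests

BASE_URL = "https://api.codat.io"
API_KEY = "your-codat-api-key"      # placeholder credential
COMPANY_ID = "your-company-id"      # placeholder company identifier

def get_invoices_for_customer(customer_id):
    """Fetch invoices filtered to one customer via Codat's query parameter."""
    url = f"{BASE_URL}/companies/{COMPANY_ID}/data/invoices"
    # requests URL-encodes the "=" to %3d, matching the call shown above
    params = {"query": f"customerRef.id={customer_id}"}
    resp = requests.get(url, params=params, auth=(API_KEY, ""))
    resp.raise_for_status()
    return resp.json()

invoices = get_invoices_for_customer("61")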

Sample Codat data

A GET call returns a JSON object containing the fields of the specified dataset. Invoices, for example, have 11 possible properties, though not all of them may be present for any given record, so the JSON might look like:

{
    "id": "20",
    "invoiceNumber": "1001",
    "customerRef": {
      "id": "55",
      "companyName": "Oxon - Holiday Party"
    },
    "issueDate": "2017-01-24T00:00:00",
    "dueDate": "2017-02-23T00:00:00",
    "currency": "GBP",
    "totalAmount": 10800,
    "amountDue": 0
}
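Nested objects such as customerRef need to be flattened into scalar columns before they can live in a warehouse table. A hedged sketch of that step in Python, with illustrative column names rather than a canonical schema, might look like:

import csv

def flatten_invoice(invoice):
    """Flatten one Codat invoice record into a dict of scalar columns."""
    customer = invoice.get("customerRef") or {}
    return {
        "id": invoice.get("id"),
        "invoice_number": invoice.get("invoiceNumber"),
        "customer_id": customer.get("id"),
        "customer_name": customer.get("companyName"),
        "issue_date": invoice.get("issueDate"),
        "due_date": invoice.get("dueDate"),
        "currency": invoice.get("currency"),
        "total_amount": invoice.get("totalAmount"),
        "amount_due": invoice.get("amountDue"),
    }

def write_invoices_csv(invoices, path="invoices.csv"):
    """Write flattened invoices to a CSV file ready for upload to Azure Storage."""
    rows = [flatten_invoice(inv) for inv in invoices]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)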

Loading data into Azure SQL Data Warehouse

SQL Data Warehouse provides a multi-step process for loading data. After extracting the data from its source, you can move it to Azure Blob storage or Azure Data Lake Store. You can then use one of three approaches to load the data (a small Python upload sketch follows the list):

  • AZCopy uses the public internet.
  • Azure ExpressRoute routes the data through a dedicated private connection to Azure, bypassing the public internet by using a VPN or point-to-point Ethernet network.
  • The Azure Data Factory (ADF) cloud service has a gateway that you can install on your local server, then use to create a pipeline to move data to Azure Storage.
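If you'd rather script the upload than use AZCopy, the azure-storage-blob Python package can push a file into Blob storage for the next step. The connection string and container name below are placeholders:

from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "your-storage-account-connection-string"  # placeholder
CONTAINER = "codat-staging"                                    # placeholder

def upload_to_blob(local_path, blob_name):
    """Upload a local file to Azure Blob storage for PolyBase to read later."""
    service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
    blob_client = service.get_blob_client(container=CONTAINER, blob=blob_name)
    with open(local_path, "rb") as data:
        blob_client.upload_blob(data, overwrite=True)

upload_to_blob("invoices.csv", "codat/invoices/invoices.csv")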

From Azure Storage you can load the data into SQL Data Warehouse staging tables by using Microsoft's PolyBase technology. You can run any transformations you need while the data is in staging, then insert it into production tables. Microsoft offers documentation for the whole process.
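The PolyBase step itself is T-SQL. A rough sketch, issued here through pyodbc, defines an external table over the uploaded files and copies its rows into a staging table; the server, credentials, external data source (codat_blob), and file format (csv_format) are placeholders you'd create per Microsoft's documentation:

import pyodbc

# Placeholder connection details for the SQL Data Warehouse instance
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=yourserver.database.windows.net;DATABASE=yourdw;UID=youruser;PWD=yourpassword"
)
cursor = conn.cursor()

# External table over the uploaded CSVs; DATA_SOURCE and FILE_FORMAT must exist already
cursor.execute("""
CREATE EXTERNAL TABLE ext.invoices (
    id NVARCHAR(50),
    invoice_number NVARCHAR(50),
    customer_id NVARCHAR(50),
    customer_name NVARCHAR(200),
    issue_date DATETIME2,
    due_date DATETIME2,
    currency NVARCHAR(3),
    total_amount DECIMAL(18,2),
    amount_due DECIMAL(18,2)
)
WITH (LOCATION = '/codat/invoices/', DATA_SOURCE = codat_blob, FILE_FORMAT = csv_format);
""")

# Move the rows into staging, where transformations can run before production inserts
cursor.execute("INSERT INTO staging.invoices SELECT * FROM ext.invoices;")
conn.commit()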

Keeping Codat up to date

At this point you've coded up a script or written a program to get the data you want and successfully moved it into your data warehouse. But how will you load new or updated data? It's not a good idea to replicate all of your data each time you have updated records. That process would be painfully slow and resource-intensive.

Instead, identify key fields that your script can use to bookmark its progression through the data, so it can pick up where it left off as it looks for updated data. Timestamp fields such as updated_at or created_at, or auto-incrementing IDs, work best for this. When you've built in this functionality, you can set up your script as a cron job or continuous loop to get new data as it appears in Codat.
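One way to sketch that bookmarking in Python, assuming Codat's query parameter can filter on a modified-date property and that the response wraps records in a results array (verify both against Codat's API docs), is to persist the latest value you've seen between runs. The bookmark file and helper names here are hypothetical:

import os
import requests

BOOKMARK_FILE = "codat_invoices_bookmark.txt"   # hypothetical local bookmark store

def read_bookmark(default="2000-01-01T00:00:00"):
    """Return the last-seen modified date, or a default for the first full load."""
    if os.path.exists(BOOKMARK_FILE):
        with open(BOOKMARK_FILE) as f:
            return f.read().strip()
    return default

def save_bookmark(value):
    with open(BOOKMARK_FILE, "w") as f:
        f.write(value)

def get_new_invoices(base_url, company_id, api_key):
    """Fetch only invoices modified since the bookmark, then advance the bookmark."""
    since = read_bookmark()
    # ">" is URL-encoded to %3e by requests, matching the operator syntax above
    params = {"query": f"modifiedDate>{since}"}
    resp = requests.get(
        f"{base_url}/companies/{company_id}/data/invoices",
        params=params,
        auth=(api_key, ""),
    )
    resp.raise_for_status()
    invoices = resp.json().get("results", [])   # adjust to the actual response shape
    if invoices:
        save_bookmark(max(inv["modifiedDate"] for inv in invoices))
    return invoices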

And remember, as with any code, once you write it, you have to maintain it. If Codat modifies its API, or the API sends a field with a datatype your code doesn't recognize, you may have to modify the script. If your users want slightly different information, you definitely will have to.

Other data warehouse options

Azure SQL Data Warehouse is great, but sometimes you need to optimize for different things when you're choosing a data warehouse. Some folks choose to go with Amazon Redshift, Google BigQuery, PostgreSQL, Snowflake, or Panoply, which are RDBMSes that use similar SQL syntax. If you're interested in seeing the relevant steps for loading data into one of these platforms, check out To Redshift, To BigQuery, To Postgres, To Snowflake, and To Panoply.

Easier and faster alternatives

If all this sounds a bit overwhelming, don’t be alarmed. If you have all the skills necessary to go through this process, chances are building and maintaining a script like this isn’t a very high-leverage use of your time.

Thankfully, products like Stitch were built to move data from Codat to Azure SQL Data Warehouse automatically. With just a few clicks, Stitch starts extracting your Codat data via the API, structuring it in a way that is optimized for analysis, and inserting that data into your Azure SQL Data Warehouse.