When people think about “Big Data”, a major challenge is often what to do with the data after it’s collected. How do we store, visualize, and ultimately analyze extremely large data sets?

Before any of that can happen, though, Turnkey’s clients first need to identify where all their data lives and how it can be brought together.

Because our Audience Portal clients run their businesses on data, it is no surprise that we are asked almost daily to help clients connect multiple data sources. For example, we may be asked – or required – to receive data from a ticketing platform, process those customer records in Prospector, and return the enhanced demographic and lead-scoring data set to a CRM system or data warehouse. Another example might be delivering survey respondents to a loyalty platform so it can award them points for taking surveys.

The scenarios are limitless, so Turnkey had to develop a solution that makes it simple to connect the Audience Portal to virtually any other platform, without involving an IT specialist for each custom project. That solution, to be deployed to our clients this winter, is DataPort – a plug-and-play method for transferring data securely and on a schedule.

Most vendors tend to focus on providing flexible APIs that allow other vendors to develop custom solutions against their platform. In theory, this solves the problem – “Yes, you may have your data. Please help yourself to our API.” Unfortunately, this leaves the client with a lot of work. Who is going to implement these “techie” APIs? How much is that going to cost? How long will that take? Which vendor is responsible for the heavy lifting? The API answer always falls a little short of client expectations… which is why Turnkey Intel moved away from pushing clients to our own API and toward DataPort.

DataPort is a plug-and-play solution that allows Audience Portal clients to connect two systems. Here’s an example of how to export Prospector data to a remote file location using DataPort.

  1. Choose an origin, such as Prospector, and a destination, such as a remote FTP server.
  2. Determine which data fields to export from Prospector to an Excel or CSV file, and name the columns accordingly.
  3. Name the export.
  4. Set a recurring schedule for when the data transfer should happen, such as every morning at 5 AM.
[Image: A sample DataPort automation.]

Done! Now your Prospector data set of choice is scheduled to be shipped to your secure FTP location for pickup each morning.
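For a sense of what that configuration replaces, here is a rough sketch of the kind of one-off script a client would otherwise have to write and maintain to produce the same scheduled export by hand. It is illustrative only: the Prospector query, field names, credentials, and server details are placeholders (DataPort handles all of this through configuration, not code), and the example assumes the third-party paramiko library for the SFTP upload.

```python
# Illustrative sketch of a hand-rolled export job - the kind of custom work
# DataPort is designed to eliminate. All names and values are placeholders.
import csv
import io
import paramiko  # third-party SSH/SFTP library (assumed dependency)

SFTP_HOST = "ftp.example-client.com"          # placeholder destination server
SFTP_USER = "export_user"                     # placeholder credentials
SFTP_PASS = "********"
REMOTE_PATH = "/incoming/prospector_export.csv"

# Example columns only; the real field list would come from Prospector.
FIELDS = ["customer_id", "email", "lead_score", "household_income"]


def fetch_prospector_records():
    """Placeholder for pulling the selected fields from Prospector via its API."""
    return [
        {"customer_id": 1001, "email": "fan@example.com",
         "lead_score": 87, "household_income": "100-150k"},
    ]


def build_csv(records):
    """Write the selected fields to an in-memory CSV, one column per field."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue().encode("utf-8")


def upload_sftp(payload: bytes):
    """Push the CSV to the client's secure FTP location."""
    transport = paramiko.Transport((SFTP_HOST, 22))
    transport.connect(username=SFTP_USER, password=SFTP_PASS)
    try:
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.putfo(io.BytesIO(payload), REMOTE_PATH)
        sftp.close()
    finally:
        transport.close()


if __name__ == "__main__":
    # Scheduling (e.g. every morning at 5 AM) would live outside the script,
    # typically as a cron entry: 0 5 * * * /usr/bin/python3 export.py
    upload_sftp(build_csv(fetch_prospector_records()))
```

Every piece of that script – credentials, field mapping, the upload itself, and the cron schedule – becomes a few form fields in a DataPort automation.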

It’s as easy as that. Potential DataPort sources/connections currently include FTP (including SFTP), Surveyor, Prospector, and Formstack. While connections to and from FTP cover nearly all client needs, we have the ability to add additional connection endpoints to DataPort. Any system with an API can potentially be added to DataPort for plug-and-play connectivity. We plan to add additional endpoints as we see common needs develop among clients.

We’re hopeful that our clients will see immediate value in DataPort. Goodbye, complex API implementations, and the resources needed to make them work for you; hello, DataPort!

#####