
BBC Media Action Data Portal: Case Study

The Data Portal required remote API access to our proprietary survey data collation engine. Here's how we did it.




The BBC contacted the SwissPeaks Group to aid with a survey-data processing and homogenisation project for their Media Action department. Initially, SwissPeaks (via <mount/alpine>) were tasked with receiving raw data from fieldwork, cleaning it, and sending it on to another company, in the form of a ready-to-use database, for use in a data portal. This data portal was to be a public resource allowing anyone to examine the collected data, across all countries and "Themes" (provided by the BBC), in a compact and accessible way.

As detailed below, the brief changed as the project started and the technical difficulties inherent in working with survey data surfaced.

Technical Difficulties in Survey Data

As many researchers are aware, survey data has some complicated and nuanced meta-data properties that do not allow it to be treated in the same way as other data types. Briefly, these properties are:

  • The MULTI (multiple-response) variable type
  • Respondent level storage
  • Translations
  • Weights
  • Nets and Derived Variables

These properties tend to confuse and confound engineers who are more used to standard data types (sales, traffic, anything scalar, etc.), especially the multi-type variable. Upon delivering our ready-to-use database, the web-software company tasked with the front end found it a little difficult to understand - and we totally get it.
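To illustrate why the multi-type variable trips people up, here is a minimal sketch (the respondent records, field names and codes are all invented for illustration, not taken from the project): each respondent is stored as one record, a MULTI question holds several codes at once, and weights apply per respondent, so column percentages can legitimately sum to more than 100%.

```javascript
// Illustrative respondent-level records (invented data).
// "mediaUsed" is a MULTI: one respondent can select several codes.
const respondents = [
  { id: 1, weight: 0.8, mediaUsed: ["radio", "tv"] },
  { id: 2, weight: 1.2, mediaUsed: ["radio"] },
  { id: 3, weight: 1.0, mediaUsed: ["tv", "internet"] },
];

// Weighted percentage of respondents selecting each code.
// The base is the respondent weight total, not the answer count,
// which is why the percentages can exceed 100% in total.
function weightedMultiTable(records, field) {
  const totalWeight = records.reduce((sum, r) => sum + r.weight, 0);
  const table = {};
  for (const r of records) {
    for (const code of r[field]) {
      table[code] = (table[code] || 0) + r.weight;
    }
  }
  for (const code of Object.keys(table)) {
    table[code] = (table[code] / totalWeight) * 100;
  }
  return table;
}
```

Treating such a table as if it were scalar data (where categories partition the base and sum to 100%) is the usual source of confusion.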

The Proposed Solution

After some deliberation on how <mount/alpine> and the web-application company could work together to deliver the data portal, we decided it was best to take the back-end development in-house and offer an easy-to-use RESTful API for the web-application company to work with.

Having already built the homogenised database, we would be able to quickly start development on the API, using our proprietary survey data collation engine as our starting point.

As well as being RESTful, the API also needed to:

  • Deliver "Chart Ready" data back to the charting solution
  • Allow the calling system to access up-to-date meta-data so it can auto-update
  • Cater for unavoidable region/theme-specific scenarios (problems) introduced at the fieldwork stage
  • Be packaged in a low-footprint file structure to allow mounting on remote servers
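As a rough sketch of what the first two requirements imply for the caller (the payload shape, field names and version scheme here are invented for illustration, not the actual API contract): the response carries both chart-ready series and the meta-data the front end needs to decide when to refresh itself.

```javascript
// Hypothetical "chart ready" payload: series and categories arrive
// pre-aggregated, so the front end never touches respondent-level data.
const exampleResponse = {
  meta: { version: "2018-04-01", country: "Tanzania", theme: "Governance" },
  categories: ["2016", "2017"],
  series: [
    { label: "Radio", values: [42.1, 38.5] },
    { label: "TV", values: [25.0, 31.2] },
  ],
};

// Returns true when the locally cached meta-data is out of date,
// signalling the calling system to auto-update its menus and labels.
function metaIsStale(cachedVersion, response) {
  return cachedVersion !== response.meta.version;
}
```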


The API was delivered in good time for the Data Portal to launch without a hitch. As usual, our development gurus worked to our high coding and best-practice standards and delivered a system that outperformed the brief. The API continues to provide live data, found at:

Additional Offerings

Alongside the development of the collation API, we also worked with the web-application company's front-end developers to get them up to speed with the JavaScript charting solution chosen for the data portal. <mount/alpine> have many years of experience working with AMCharts, Kendo UI, jQuery and other related canvas-drawing front-end libraries, and we were able to integrate our API calls directly into the code controlling AMCharts.
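The glue code involved looks roughly like this sketch (the payload shape and field names are invented; `AmCharts.makeChart` and `$.getJSON` are the real AmCharts v3 and jQuery entry points): reshape the API response into the flat `dataProvider` array that AmCharts serial charts expect.

```javascript
// Reshape a hypothetical API payload ({ categories, series }) into
// AmCharts' dataProvider format: one object per category, one
// property per series label.
function toDataProvider(payload) {
  return payload.categories.map((cat, i) => {
    const row = { category: cat };
    for (const s of payload.series) {
      row[s.label] = s.values[i];
    }
    return row;
  });
}

// In the browser, the result would be handed to the chart roughly so:
// $.getJSON("/api/chart-data", function (payload) {
//   AmCharts.makeChart("chartdiv", {
//     type: "serial",
//     categoryField: "category",
//     dataProvider: toDataProvider(payload),
//     graphs: payload.series.map(s => ({ valueField: s.label, type: "column" })),
//   });
// });
```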





For those who are more development oriented, here's a jargon-ridden breakdown statement of the project:

<mount/alpine> delivered a RESTful API, backed by an SQL database, built in PHP 7.1 to PSR-2/PSR-4 standards on the Lectric framework. The API accepts jQuery ajax calls made from the front end, authenticated with hash-generated API keys, and always provides graceful error-fallback handling. The API has been packaged with Composer, allowing remote installation via Packagist. Serving requires Apache and MariaDB, with PHP handled via FastCGI.
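From the caller's side, "graceful error fallback" means the front end never has to deal with raw server errors. A minimal sketch, assuming the API wraps every result in a status envelope (the envelope shape is invented for illustration):

```javascript
// Unwrap a hypothetical { status, data } envelope. On any error the
// caller receives a usable fallback value instead of an exception,
// so a chart can render an empty state rather than breaking the page.
function unwrap(envelope, fallback) {
  if (envelope && envelope.status === "ok") {
    return envelope.data;
  }
  return fallback;
}
```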

