The use and processing of data, and specifically data related to human action, has become pervasive and, in some cases, controversial and opaque. At Carbon we strive to build commercial tools that are beneficial to people, not merely tools that achieve functional goals and address technical problems. We believe that core to this are the principles of choice and of a value exchange between ourselves, our clients and the end user. By establishing Carbon as a leader in ethical data custodianship and data brokering, we will foster the heightened level of trust between people and technology that is needed for its fruitful use in our daily lives.

On this page we set out the principles that demonstrate our ethical ‘DNA’. We first describe the core set that allows Carbon to be defined as trustworthy. We then set out our stance on what it means to use the data in our care ethically. Finally, our policy guidelines, which are more concrete, are used to ensure our systems remain human-centric, serving ethical values and principles.

Our Data Principles

There has been a great deal of recent research and a number of proposals across the data ecosystem. We have drawn on this work, whether from government, voluntary trade bodies such as the NAI and IAB, independent agencies such as the ICO, charities, or companies both large and small, to help us define our principles.

1. Data Agency

We will empower individuals with the ability to access their data and will maintain people’s capacity to control their identity.

Individuals are in control of whether and how their data is used. We comply with regional legislative requirements such as GDPR and CCPA, and consent must be provided for the different purposes specified by TCF 2.0. Where consent permits, we may collect and/or match on a hashed email address as part of an identity resolution mechanism.
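To illustrate how hashed-email matching can work in principle, the sketch below normalises an address and takes a one-way hash, so that records can be matched without the raw address ever being stored. This is a minimal, hypothetical illustration: the normalisation rules and hash algorithm shown (lowercasing plus SHA-256) are assumptions, not a description of Carbon’s actual scheme.

```python
import hashlib

def hashed_email(email: str) -> str:
    """Normalise an email address and return its SHA-256 hex digest.

    Illustrative only: the exact normalisation and hash algorithm used
    for identity resolution are assumptions, not a documented scheme.
    """
    normalised = email.strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Two syntactic variants of the same address produce the same digest,
# so records can be matched without storing the raw address.
print(hashed_email(" User@Example.com ") == hashed_email("user@example.com"))
```

Because the hash is one-way, a party holding only the digest cannot recover the original address, which is why this pattern is common in consented identity resolution.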

We also agree with the CMA’s ‘Fairness by Design’ principle that places a duty on platforms to make it as easy as possible for users to make meaningful choices.

We offer a mechanism to allow an individual to access the data that has been collected based on the device they are using and the ability for them to immediately remove the association of the data to their device.

2. Transparency

We will ensure the basis of any data processing and actions are always discoverable.

This is important:

  • For users, to understand what the system is doing and why. 
  • For creators, to allow them to validate and achieve certification of their intelligent systems. 
  • For incident investigators, to understand what caused an incident. 
  • For those in the legal process, to inform evidence and decision-making. 
  • For the public, to build confidence in the technology.

We are able to trace the data and explain the reasoning for actions the Carbon system takes.

It’s worth noting that, within the wider industry, the direction of travel is to revamp the advertising ecosystem to regain user trust. One example of this is the Ads Transparency Spotlight, a Chrome extension that makes it easier for consumers to understand how and why ads are being shown. Another recent release is an early-stage technology, the Ads Transparency Spotlight Data Disclosure schema, which allows ad tech providers to describe how their ads work and why they appear on the page.

3. Security

We will guard against possible misuse and ensure data is kept safe.

Individuals should know that their data is secure and protected from misuse. Protecting data from unauthorised access involves the use of network security, access controls and the judicious use of encryption (for example, on data egress). Most cloud environments provide such capabilities, but these must be set up and monitored, and even with correctly configured infrastructure other attack vectors remain possible. Our ISO 27001 accreditation emphasises the importance we place on establishing controls and guards to protect against potential misuse.

4. Accountability

We will develop practices for holding ourselves accountable to shared standards.

Accountability in the ad tech domain means that organisations that do not adhere to the regulations are held to account. Carbon adheres to the highest standards and will only use consented personal or device data. If data is not consented, we will not use it, and we will not delve into the murky grey world of trying to get around the standards.

The Centre for Data Ethics and Innovation Review of Online Targeting suggests that a code of practice is created that requires organisations to adopt standards of risk management, transparency and protection of people who may be vulnerable, so that they can be held to account for the impact of online targeting systems on users. 

Carbon supports this and in our view this falls into the following:

  • The creation and documentation of our approach to risk. An example of this is our stance on sensitive signals, described below.
  • Being responsive when failures occur.
  • Providing mechanisms that allow the rationale behind any decisions or outcomes made using data to be clearly articulated. Our intention is to support this by embedding data structures and processes into the architecture of the system to support the explanation of decisions made.
  • Communicating our stance clearly, along with our association with, or subscription to, relevant codes of conduct such as the NAI Code of Conduct.

5. Trust

We will be responsive to users, use plain language to communicate clearly and be consistent in all we do.

Business leaders recognize the importance of building trust with individuals and know that trusted businesses generate more revenue – indeed the long-term success of the industry depends on public and client trust.  

The previous guiding principles allow Carbon to position itself as an open and transparent data management system. Users can opt in or out, control how their data is used, and obtain an unambiguous rationale as to how any decisions that used their data came about. 

To become trustworthy also requires us to be responsive when failures happen and to use plain language to communicate clearly with consumers. Our privacy policy illustrates this principle by describing in straightforward terms why we collect data, what data we collect, how we use it, and how to opt out. 

6. Ethical Use of Data

We now turn to the principles that set out our commitment to how Carbon will use the data it sees. These set the boundaries, describe how we deal with sensitive data, and outline our approach to limiting bias in our AI-based analytics and inference components.

  • Limits of use

    We will only operate within the advertising and marketing industry with our aim being to help unlock the value in our publishers’ data.

    Carbon operates in the ad-tech domain and our mission is to make publishing more profitable, thus supporting an open and free internet for the benefit of consumers. Many publishers offer free content, and a major source of income is advertising on their sites. Carbon’s technologies help to improve the revenue a publisher makes from their sites by allowing publishers to supply their advertisers with cohorts of relevant users. This means that users tend to see ads that are relevant to them and are more likely to click through to the advertiser.

    Our clients and partners are vetted through our sales selection criteria and our on-boarding process which includes compliance checks such as ensuring they have set up their consent management platform correctly and are abiding by the relevant legal and industry frameworks. Furthermore we work closely with them to help create relevant and transparent audiences for their business. 

    Where we capture data for purposes other than tailored advertising, we will ensure that such data is only used for its consented or legitimate purpose. For example, if we capture data only for research purposes then it will only be used for research.

  • Unfair bias

    We will work to avoid unfair bias in our models and inference mechanisms.

    It is imperative that we are mindful of how bias can become embedded in our models, and we are aware that AI algorithms and datasets can sometimes reflect or reinforce societal biases. As an example, the first cities that deployed a smartphone technology to prioritize road maintenance saw wealthy communities receive the most attention, because those were the people with the most smartphones. The well-intended system amplified existing economic inequality and damaged public trust.

    Carbon does not use sensitive data (see below), and for other, non-sensitive datasets we will challenge ourselves and the data providers if we feel the data is biased or skewed. Where we can, we build our models from the raw data we collect. Such models do not suffer from the potential bias induced by modelling on a restricted subset of the population and then applying the model to the whole.
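    One simple way to challenge a dataset for skew is to compare the share of each group it contains against a known reference distribution and flag any group whose share deviates beyond a tolerance. The sketch below is illustrative only: the group labels, proportions and tolerance are made-up assumptions, not real audit figures.

```python
def flag_skew(observed: dict, reference: dict, tolerance: float = 0.05) -> dict:
    """Return groups whose observed share differs from the reference
    share by more than `tolerance` (absolute difference in proportion)."""
    return {
        group: (observed.get(group, 0.0), share)
        for group, share in reference.items()
        if abs(observed.get(group, 0.0) - share) > tolerance
    }

# Hypothetical example: a dataset that over-represents 18-34s.
observed = {"18-34": 0.55, "35-54": 0.30, "55+": 0.15}
reference = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
print(flag_skew(observed, reference))  # flags "18-34" and "55+"
```

    In practice a statistical test (for example, a chi-squared goodness-of-fit test) would replace the fixed tolerance, but the principle of auditing against a reference population is the same.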

  • Sensitive signals

    We will not process sensitive data.

    Carbon’s mission statement is to make publishing more profitable. We believe that we can execute against that whilst taking measures to ensure we set the highest standards in data quality and compliance, including our approach to sensitive data.

    Carbon has decided to take a stance of NOT processing such data, either directly or through third parties, and to ensure we take regional regulations and guidelines into account.

  • Scientific excellence

    We will incorporate scientific rigour into our modelling and prediction components to support our aspirations of scientific excellence.

    We make significant efforts to collaborate with academic institutions to ensure our algorithms are of the highest quality. Our first Knowledge Transfer Partnership (KTP) project with Durham University won a national award, and our second KTP is currently underway. We won a European Regional Development Fund sponsored PhD with Durham University and recently had a paper accepted at the internationally important British Machine Vision Conference 2020.

    We apply similar rigour to monitoring the performance of our data, models and decision-making mechanisms to ensure they maintain their high quality. This is also the case when we use third-party data: we will assess such data and challenge the provider if we find the quality lacking. As an example, we challenged the data science model of a tier-one data provider for some of their demographic data, and as a result the model was changed so that it was no longer skewed.

Our commitment to ethics

These principles set out our commitment to build ethically designed software, to develop an open and transparent partnership with our users and clients, and to only operate within ethically responsible application areas and use of data. 

Carbon’s mission is to make publishing more profitable, and by being transparent, accountable and secure, and by promoting choice, we will be known as a trustworthy leader in ethical data custodianship and data brokering. Creating a value exchange between ourselves, our clients and the end user will support an open and free internet that benefits both clients and consumers.

Carbon is now part of Magnite.