GUEST CHAT with David Ranner, Commercial Director at CameraForensics

In our sixth interview with an External Guest we meet David Ranner, Commercial Director of CameraForensics. In his role, his focus is people and partnerships – whether that’s the team, technical collaborators, international partners, the law enforcement community or government agencies. He wants to see the issue of online safety rise up the global political agenda and views CameraForensics as ideally placed to protect people from the crimes of grooming and sexual exploitation.

In our Chat, David talks about how one phone call changed the fate of the company and about having a “North Star” purpose that makes the company’s decision-making remarkably easy. He also offers valuable parental advice on navigating your kids’ online safety and discusses how customer feedback motivates and inspires his work.

Tell us a few key things about CameraForensics.

Our mission is to provide technologies to help with the problem of online child exploitation. The company was started 10 years ago by Matt Burns after he was burgled. He is a photography and technology geek, and he knew that serial numbers are embedded into digital photographs. He thought that if he could find out whether someone was using his stolen camera on the internet, he could potentially track down who had broken into his flat.

He discovered that nobody offered a service to search the internet for those serial numbers, so he put together some software to do it as a hobby project, called StolenCameraFinder.
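
The underlying idea – reading a camera serial number out of a photo’s embedded metadata – can be sketched in a few lines. This is only an illustration using the Pillow library and the standard EXIF BodySerialNumber tag; it is an assumption for clarity, not StolenCameraFinder’s actual code.

```python
# Minimal sketch: read a camera body serial number from a photo's EXIF data.
# Uses Pillow purely for illustration; not CameraForensics' implementation.
from PIL import Image

EXIF_IFD_POINTER = 0x8769      # standard pointer to the Exif sub-IFD
BODY_SERIAL_NUMBER = 0xA431    # EXIF 2.3 "BodySerialNumber" tag

def camera_serial(path):
    """Return the camera body serial number embedded in the image, if any."""
    with Image.open(path) as img:
        exif_ifd = img.getexif().get_ifd(EXIF_IFD_POINTER)
        # Many cameras never record this tag, so None is a common result.
        return exif_ifd.get(BODY_SERIAL_NUMBER)

if __name__ == "__main__":
    print(camera_serial("holiday_photo.jpg"))
```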

This came to the attention of the National Crime Agency’s Child Exploitation and Online Protection Command, who thought: “Can we use this for child victim identification in online abuse cases?” At the time, Matt knew nothing of this world.

How did this initial interaction happen? Was it through Matt’s personal contacts, or something else?

It was actually a very clued-up investigator who contacted Matt out of the blue. Matt then discovered that investigators all around the world were using StolenCameraFinder for online child abuse cases, and we hadn’t realised it. So we started talking to them, and we were invited to the Interpol “Crime Against Children” meeting in Lyon back in November 2015. We went along and were stunned by the amazing people, how much great work they were doing and how much was still to be done. When we came back, we decided to repurpose the whole company. We thought this was an important problem and we wanted to help solve it.

What cool technologies support your work?

I am not a fan of buzzwords, but if you insist, we work with what is commonly called ‘Big Data’. We use web crawlers to look for images. Our underlying crawling technology is StormCrawler, and we’re lucky enough to have the author of that technology on the team, which is brilliant! We typically process about 10 million images a day. We apply a lot of different tech to process these images and extract information from them, both from the metadata and the pixels. We also use machine learning algorithms to extract information that may be useful to investigators.
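
As a purely illustrative sketch of a pixel-derived feature (the imagehash library and the choice of a perceptual hash are assumptions here, not CameraForensics’ published tooling), a processing step might fingerprint the pixels so near-duplicate copies of an image can be matched later:

```python
# Illustrative pixel-side feature: a perceptual hash that survives resizing
# and recompression, so near-duplicate images can be matched later.
# The imagehash library is used only as an example of this technique.
from PIL import Image
import imagehash

def pixel_fingerprint(path):
    with Image.open(path) as img:
        return str(imagehash.phash(img))

# Two copies of the same photo at different sizes should hash very similarly.
print(pixel_fingerprint("crawled_image.jpg"))
```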

These 10 million images a day that you process: are they for a particular case? Can you please talk about the process?

When you do a Google search, Google doesn’t go out onto the internet to search for things. It goes to the library it has already built. It’s like looking something up in the index of an encyclopaedia. It’s similar for us.

Our web crawlers are always looking for images. They do it in a broad and shallow way (some images from many different devices, rather than lots of images from the same device). And we’re not looking for any specific images, or images taken by a particular camera; we want any images we can find. Every time we find an image, we do a lot of calculation and computation on it, and we put all the results into a database. We have several billion images in our database. So the actual data collection is done upfront, agnostic of the purpose, and then we can search the database to answer the requests we receive from investigators. That’s one part of it. We also do research and development with the users.
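
The “index first, query later” pattern David describes can be illustrated with a toy sketch. The data and structure below are invented for illustration; the real system holds several billion records with far richer features.

```python
# Toy sketch of the "index first, query later" pattern described above.
# Real systems use distributed crawlers and large databases; this uses a dict,
# and the URLs and serial numbers below are made up for illustration.
from collections import defaultdict

# Pretend these are images found by a crawler, with serial numbers already
# extracted from their EXIF data (as in the earlier sketch).
crawled = {
    "https://example.org/a.jpg": "SN-1234567",
    "https://example.org/b.jpg": "SN-7654321",
    "https://example.org/c.jpg": "SN-1234567",
}

# 1) Upfront, purpose-agnostic collection: index every image as it is found.
index = defaultdict(list)
for url, serial in crawled.items():
    if serial:
        index[serial].append(url)

# 2) Later, an investigator's request is just a lookup against that index.
print(index.get("SN-1234567", []))   # both images from the same camera
```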

It is scary to think that the offenders are just out there using the Open Web.

If you look at the most tech-savvy offenders, they are really serious adversaries; they are really clued up on tradecraft and technologies. They are very hard to catch and are comparatively small in number. You’ve also got a huge wave of offenders who aren’t tech-savvy, who are naïve and make mistakes. And a lot of this activity goes on on the Open Web.

You develop the technologies and methods to help the investigators. How close are you to the cases? Have you ever had to look at the photographs?

As a company we made a decision quite early on that we’d never make our staff look at indecent photographs of children. It’s a line that, once crossed, you can’t go back and unsee what you’ve seen. And it can be psychologically damaging. I’ve got immense respect for the investigators who work with it day in and day out. Of course, we do hear about the cases and see sanitised images. However, we don’t see the actual images, and within the UK we legally can’t. The law doesn’t allow it.

What’s the company’s culture like?

As a company we try, first and foremost, to understand that our staff have lives and commitments outside of work! That comes first, and the fact that they work for us is secondary. We try not to be hierarchical. How our current approach will scale as we grow is yet to be seen [laughing]. Everybody’s got a valuable opinion, so we try to give people a large degree of autonomy and to be open. We try to keep the cool, fun, geeky stuff within the company and we outsource other things. And people come to work for us also because of the subject matter – knowing that the software you’re writing can help to solve some important issues.

We do occasionally get e-mails from customers, saying: “I used your software and I rescued a child. Thanks.” That’s pretty amazing. It doesn’t happen often, but it happens often enough.

What are the current areas of your interests?

To develop more tools to help victim ID investigators. We’ve got a good R&D roadmap of technologies that all focus on online investigations and safety. The internet has had a massive impact on our children and how they grow up – some good, some bad. But we can’t switch it off now; it exists. It’s now about how we can mitigate that, how we can use the technology to help our children have the right experiences growing up. But it’s not just a technology problem – it’s societal. One of the key things is: don’t ban your kids from doing stuff, for instance accessing social media, because they will likely go and do it anyway behind your back. You need to keep the conversation going with your kids, so that if something bad or worrying happens to them online, like being approached by somebody, they can tell you about it.

What are the companies out there you keep an eye on and why?

It is a wonderful community of people. A company in Sweden without which we wouldn’t be who we are today is Griffeye. They are real leaders in image investigation software. We have a wonderful partnership with them; they are great people doing some fantastic work. In the UK there are some great companies working in this field, and internationally we are lucky to have excellent relationships with a number of NGOs like Thorn, the Canadian Centre for Child Protection and Global Emancipation Network. They’re all doing some really amazing work!

What is the technological dynamic like between the good guys and the baddies?

It is often described as an arms race. I’d say the very savvy bad people are ahead. As institutions, some law enforcement agencies find it quite hard to innovate or to access innovative technologies. But I would say law enforcement is ahead of the pack of less tech-savvy bad people. So it is a bit of a mix, and it is an interesting dynamic. And just because a technique has been understood by some offenders and defeated by others doesn’t mean that it shouldn’t be used any more. It is like gloves and fingerprints.

What inspires you at work?

Every time we make a big decision, we think – will this help the investigators to find more children? This is what we call our North Star, which makes the decision making remarkably easy. I personally get very inspired by the people we work with – our users and stakeholders. I now have friends from across the world. And, of course, our team. They are amazing!

What are your thoughts on the privacy versus crime prevention / child protection debate and legislation in this area? How do you think they can be balanced?

I don’t think that’s the right question. It’s not privacy versus solving crimes. It’s not just about the privacy of your ability to communicate online without somebody reading it; think about an abused child and their right to privacy. Some of the legislation, particularly in the EU, has recently been stopping technology companies from detecting known abuse material on their networks. I see this type of technology more like a virus scanner or a filter. We don’t mind antivirus programmes checking our machines. We’re happy for our e-mail to be filtered for spam. Why can’t we also be alright with checking for child abuse material? I do understand people’s concerns about privacy, where the right to privacy can be taken away step by step and, suddenly, for example, we can’t say anything negative about the government. So it is a highly complex subject, but its complexity should not be used as an excuse not to deal with child abuse material.

There are some organisations doing great work in thought leadership and lobbying, particularly WeProtect. I also have a lot of respect for the Canadian Centre for Child Protection’s stance on these issues. They are not shy about taking industry to task where necessary, and they are doing some brilliant work. There are currently some real disconnects in how we think about and approach these things. For example, it is so easy to upload a video to an adult pornography site, but trying to get something removed is really difficult. That has got to change.

Decision Lab
https://decisionlab.co.uk/