Cryptographic web of trust

As our identity moves more and more onto the Web, our ability to have people trust that what we write has not been altered becomes increasingly important. As our home page becomes our OpenId, linking to our CV, blogs and foaf information, it will become more important for services to be able to trust that the information they see really is what I wrote. Otherwise the following scenario becomes all too easy to imagine. Someone breaks into my web server and changes the OpenId link on my home page to point to a server they control. They then go and post comments around the web using my OpenId as their identity. Their server of course always authenticates them. People who receive the posts then click on the OpenId URL, which happens to be my home page, read information about me, and seeing that information trust that the comment really came from me.

What is needed is some way to give people more methods of deciding whether to trust the information I state. Here I describe how one can use cryptography to increase that trust level. Starting from my creation of a PGP key, I show how I can describe my public key in my foaf file, use PGP to sign that file, and then link to the signature so that people can detect a tampered file. From this basis I show how one can build a very solid cryptographically enhanced web of trust.

Creating your PGP public key

After reading the first part of a PGP book on SafariBooks Online, and feeling comfortable that I understood the basics, I decided it was high time for me to create myself a public PGP key. Reading the GPG Manual and a few other HOWTOs on the web, and using GnuPG, I managed to make myself one quite easily.

Linking to the Public Key from your foaf file

Having this key, it was just a matter of placing it on my web site and using the Web Of Trust ontology developed by Dan Brickley to point to it and describe it, with the following triples:

@prefix wot: <> .
@prefix : <> .

:me  is wot:identity of [ a wot:PubKey;
                        wot:pubkeyAddress <>;
                        wot:fingerprint "0DF560B5DADF6D348CC99EA0FD76F60D4CAE10D7";
                        wot:hex_id "4CAE10D7";
                        wot:length 1024 ] .

The is wot:identity of construct is a nice N3 shorthand for referring to the inverse of the wot:identity relation without having to name it. This states where my public key can be found, and describes its fingerprint, key length and hex id. I am not sure why the wot:PubKey resource has to be a blank node, and can't be the URL of the public key itself, which would make for the following N3:

:me  is wot:identity of [ = <>
                          a wot:PubKey;
                          wot:fingerprint "0DF560B5DADF6D348CC99EA0FD76F60D4CAE10D7" ] .

Perhaps simply because it is quite likely that one would want to put copies of one's public key in different places? owl:sameAs could have done the trick there too though...
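Incidentally, the key description in these triples can be cross-checked: for a v4 PGP key the short hex id is the low 32 bits of the 160-bit SHA-1 fingerprint, i.e. its last eight hex digits. A small Python sketch, using the values from the triples above:

```python
# Cross-check the wot:fingerprint and wot:hex_id triples: for a v4 PGP key
# the short hex id is the low 32 bits -- the last 8 hex digits -- of the
# 160-bit SHA-1 key fingerprint. Values copied from the N3 above.
fingerprint = "0DF560B5DADF6D348CC99EA0FD76F60D4CAE10D7"
hex_id = "4CAE10D7"

assert len(fingerprint) == 40        # SHA-1: 160 bits = 40 hex digits
assert fingerprint.endswith(hex_id)  # short id = low 32 bits of the fingerprint
print("key description is self-consistent")
```

Any tool consuming such a foaf file could run this check before trusting the wot:hex_id shorthand.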

Signing your foaf file

Anyway, once that is done, I want to be able to sign my foaf file. Of course it would be pretty tricky to sign the foaf file and simultaneously put the signature into it, as adding the signature would change the content of the file and so invalidate the signature. So the easiest solution is simply to have the foaf file point to the signature, with something like this [1]

<> wot:assurance <card.asc> .

The problem with this solution is that my foaf file currently returns two different representations, an rdf/xml one and an N3 one, depending on how it is called. (More on this in I have a Web 2.0 name!). Now a signature is valid only for a sequence of bits, and the rdf/xml sequence of bits is different from the N3 sequence, so they can't both have the same signature. There is a sophisticated solution developed by Jeremy Carroll in his paper Signing RDF Graphs, which proposes signing a canonicalised RDF graph. The problem is that the algorithm to create such a graph takes some time to compute and only works on a subset of all RDF graphs, but mostly that no software currently implements it.
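The bit-level problem is easy to demonstrate: hashing two serializations of the same one-triple graph gives different digests, so a signature over one says nothing about the other. A minimal Python illustration (the graph and URIs here are illustrative, not taken from my actual card):

```python
import hashlib

# Two serializations of the same tiny graph: one triple, once in RDF/XML
# and once in N3. The graph is identical, but the byte streams differ, so
# a signature over one representation says nothing about the other.
rdf_xml = b"""<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:foaf="http://xmlns.com/foaf/0.1/">
  <foaf:Person rdf:about="#me"/>
</rdf:RDF>"""

n3 = b"""@prefix foaf: <http://xmlns.com/foaf/0.1/> .
<#me> a foaf:Person ."""

digest_xml = hashlib.sha256(rdf_xml).hexdigest()
digest_n3 = hashlib.sha256(n3).hexdigest()

print(digest_xml == digest_n3)  # False: each representation needs its own signature
```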
Luckily there is a simple solution, which came to me from my work on creating an ontology for Atom, Atom OWL: link my card explicitly to its alternate representations (the <link type="alternate" href="..."> of Atom) [2] and sign those alternate representations. This gives me the following triples:

@prefix iana: <> .
@prefix awol: <> .

<>   a foaf:PersonalProfileDocument;
     iana:alternate <>,
                    <> .

<>     wot:assurance <> ;
       awol:type "application/rdf+xml" .

<>     wot:assurance <> ;
       awol:type "text/rdf+n3" .

Here I am saying that my card has two alternate representations, an rdf and an n3 one, what their mime type is, and where I can find the signature for each. Simple.

So if I put all of the above together I get the following extract from my N3 file:

@prefix foaf: <>.
@prefix wot: <> .
@prefix awol: <> .
@prefix iana: <> .
@prefix : <> .

<>   a foaf:PersonalProfileDocument;
     foaf:maker :me;
     foaf:title "Henry Story's FOAF file";
     foaf:primaryTopic :me ;
     iana:alternate <>,
                    <> .

<>     wot:assurance <> ;
       awol:type "application/rdf+xml" .

<>     wot:assurance <> ;
       awol:type "text/rdf+n3" .

:me    a foaf:Person;
       foaf:title "Mr";
       foaf:family_name "Story";
       foaf:givenname "Henry";
       foaf:openid <> ;
       foaf:openid <> ;
       is wot:identity of [ a wot:PubKey;
                            wot:pubkeyAddress <>;
                            wot:fingerprint "0DF560B5DADF6D348CC99EA0FD76F60D4CAE10D7";
                            wot:hex_id "4CAE10D7";
                            wot:length 1024 ] .

Which can be represented graphically as follows:

Building a web of Trust

From here it is easy to see how I can use the wot ontology to sign other files, link to my friends' public keys, sign their public keys, let them sign mine, etc., and thereby create a cryptographically enhanced Web of Trust built on decentralised identity. Of course this is still a little complicated to put together by hand, but it should be really easy to automate, by incorporating it into the Beatnik Address Book for example. To illustrate this I will show how I linked up to Dan Brickley's and Tim Berners Lee's signatures.

Dan Brickley has a foaf file which links to his public key and to the signature of his file. The relevant part of the graph is:

   <>     a foaf:PersonalProfileDocument;
               wot:assurance <> .

   <>         foaf:pubkeyAddress <> .

(foaf:pubkeyAddress is a relation that is not yet defined in foaf. Dan, one of the co-creators of foaf, is clearly experimenting here.) Now with this information I can download Danbri's rdf, his public key and the signature, and test that the document has not been tampered with. On the command line I do it like this:

bblfish$ gpg --import danbri.pubkey.asc
bblfish$ curl > danbri.foaf.rdf
bblfish$ curl > danbri.foaf.rdf.asc
bblfish$ gpg --verify danbri.foaf.rdf.asc danbri.foaf.rdf

Of course this assumes that someone has not broken into his server and changed all those files. As it happens I chatted with Dan over Skype (the address I got from his foaf file!) and he sent me his public key that way. It certainly felt like I had Dan on the other side of the line, so I trust this public key enough. But why not publish which public key I am relying on? Then Dan and others can know what key I am using and correct me if I am wrong. So I added the following to my foaf file:

:me foaf:knows    [ = <>;
                    a foaf:Person;
                    foaf:name  "Dan Brickley";
                    is wot:identity of
                          [ a wot:PubKey;
                            wot:pubkeyAddress <>;
                            wot:hex_id "B573B63A" ]
                  ] .

In the above I am linking to Dan's public key on his server, the one that might have been compromised. Note though that I specify the wot:hex_id of the public key. Finding another public key with the same hex id would be tremendously difficult, I am told. But who knows; I have not done the maths on this. I can make it even more difficult by signing his public key with my key, and placing that signature on my server.
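A back-of-the-envelope version of that maths: the hex id carries only 32 bits, while the full fingerprint carries 160, so it is the fingerprint, not the hex id, that deserves the real trust. A quick Python sketch of the two search spaces:

```python
# The hex id is 8 hex digits = 32 bits; the full fingerprint is 40 hex
# digits = 160 bits. Brute-forcing a key whose short id collides with a
# target is a ~2^32 search; colliding the full fingerprint is ~2^160.
short_id_space = 2 ** (4 * 8)       # 4 bits per hex digit, 8 digits
fingerprint_space = 2 ** (4 * 40)   # 4 bits per hex digit, 40 digits

print(f"short id space:    2^32  = {short_id_space}")
print(f"fingerprint space: 2^160 = {fingerprint_space}")
assert fingerprint_space == short_id_space * 2 ** 128
```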

bblfish$ gpg -a --detach-sign danbri.pubkey.asc

Then I can make that signature public by linking to it from my foaf file

<> wot:assurance <danbri.pubkey.asc.asc> .

Now people who read my foaf file will know how I verify Dan's information, and can detect if something does not fit between what I say and what Dan says. If they then do the same when linking to my foaf file - that is, mention in it where my public key is to be found, give its hash, and sign my public key with their key, placing that signature on their server - then anyone who wanted to compromise any of us consistently to the outside world would have to compromise the information on each of our servers consistently. As more people are added to the network, and link up to each other, the complexity of doing this grows exponentially. As there are more ways to tie information together, there are more ways people can find the information, so the information becomes more valuable (as described in RDF and Metcalf's Law), and there are more ways to find inconsistencies, thereby making the information more reliable, thereby making it more valuable, and so on, and so on...
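A toy lower bound behind that argument, under the assumption that every pair of members signs each other's key: with n members the web holds n*(n-1) attestations, and impersonating one member consistently means tampering with the victim's server plus every server that vouches for the victim.

```python
# Toy model of a fully connected signing network: each member signs every
# other member's key, so tampering consistently with one identity means
# rewriting the victim's server and each of the n-1 servers vouching for it.
def total_attestations(n: int) -> int:
    return n * (n - 1)          # one signed statement per ordered pair

def servers_to_compromise(n: int) -> int:
    return 1 + (n - 1)          # the victim's server plus every voucher

print(total_attestations(5))    # 20
print(servers_to_compromise(5)) # 5
```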

One can even make things more complicated for a would-be attacker by placing other people's public keys on one's own server, thereby duplicating the information. Tim Berners Lee does not link to his public key, but I got it over irc and published it on my web server. Then I can add that information to my foaf file:

:me foaf:knows [ = <>;
                    a foaf:Person;
                    foaf:name "Tim Berners Lee";
                    is wot:identity of
                           [ a wot:PubKey;
                             wot:pubkeyAddress <timbl.pubkey.asc> ;
                             wot:hex_id "9FC3D57E" ]
                  ] .

By making public what public keys I use, I get the following benefits:

  • people can contact me and let me know if I am being taken for a ride,
  • it helps them locate public keys,
  • the metadata associated with a public key grows,
  • more people link into the cryptography network, making it more likely that tools will be built on this system.

Encrypting parts of one's foaf file

Now that you have my public key you can send me encrypted mail or other files, verify signatures I place around the web, and read files I encrypt with my private key. Of the files I can encrypt, one interesting one is my own foaf file. Of course encrypting the whole foaf file is not so helpful, as it breaks down the web of trust aspect. But there may well be pieces of a foaf file that I would like only some people to see. This can be done quite easily, as described in "Encrypting Foaf Files" [3]. Note that a file can be encrypted for a number of different recipients simultaneously. This would be simple to do, and would be of great help in making current practice available in an intelligent and coherent way on the semantic web. One thing I could do is encrypt a section for all of my friends whose public keys I know. I am not sure what kind of information I would want to do this with, but it's good to know it's possible.
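That multi-recipient trick works the way PGP itself does it: the body is encrypted once under a fresh session key, and only that session key is wrapped once per reader. The sketch below is a toy (XOR with a per-recipient shared secret stands in for real public-key wrapping, and the names are invented); it shows the structure, not real cryptography:

```python
import secrets

# One ciphertext, many readers: encrypt the body once under a random session
# key, then wrap that session key once per recipient. Real PGP wraps it with
# each recipient's public key; here a toy shared secret and XOR stand in.
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

session_key = secrets.token_bytes(16)
recipient_secrets = {                 # stand-ins for the recipients' key material
    "danbri": secrets.token_bytes(16),
    "timbl": secrets.token_bytes(16),
}
wrapped = {name: xor(session_key, s) for name, s in recipient_secrets.items()}

# Each recipient recovers the session key with their own secret:
for name, s in recipient_secrets.items():
    assert xor(wrapped[name], s) == session_key

print(f"1 ciphertext, {len(wrapped)} wrapped session keys")
```

In real PGP the wrap step is an asymmetric encryption of the session key to each recipient's public key, but the one-ciphertext-many-wraps shape is the same.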


  1. This is how "PGP Signing FOAF Files" describes the procedure.
  2. The iana alternate relation is not dereferenceable. It would be very helpful if iana made ontologies for each of their relations available.
  3. Thanks to Steven Livingstone for pointing that way in a reply to this blog post on the openid mailing list.
  4. A huge number of resources on the subject can be found on the Semantic Web Trust and Security Resource Guide, which I found via the Wikipedia Web of Trust page.
  5. This can be used to build a very simple but powerful web authentication protocol as described in my March 2008 article RDFAuth: sketch of a buzzword compliant authentication protocol

You know i have also often thought of why you would encrypt certain parts of data for specific users - i.e. using their public key. The challenge is that when NOT doing this, you really rely on some party enforcing the security that person A can't view certain details within your FOAF file - i can certainly imagine cases where i'd like to restrict who can see certain contacts.

When encrypting using their public key you need to store a copy of that encrypted data per key - but then you don't have to worry about site X maintaining that trust - that is done for you through PGP.

This work is great Henry - look forward to seeing it move forward. Would be neat to have some demo's using Java, C# and maybe PHP - i think it's important that the KISS principle is adopted - FOAF is hard enough for most users, but it would be hoped that most wouldn't have to do all the work you have done above to encrypt certain parts of their data ....

Posted by Steven Livingstone on August 10, 2007 at 09:25 AM CEST #

The Wikipedia Web of Trust page has been updated to include the work of Audun Jøsang, which seems relevant to some of this.

On a more personal note to Story and many others. While this area is useful and interesting, and I applaud your enthusiasm to "dive in there and get something done", I also advise you to devote more energy to finding out what else has already been done by others. It's not wise to develop specifications, code, or even concepts in isolation. That leads to the "Not Invented Here" syndrome.

Posted by Eric Norman on August 10, 2007 at 04:42 PM CEST #


Most of what I describe here is built on the work of others:

- the pgp web of trust was coined by Phil Zimmermann himself
- the semantic web, RDF and N3 are well established W3C standards
- foaf is a widely used ontology for describing people
- The WOT Ontology is a very lightweight ontology developed by Dan Brickley. It just names the concepts developed in PGP with URLs, so there is nothing revolutionary here at all.

All I have done above is just show practically how these pieces can be put together, as a way of helping me explore this very large field. It is indeed a very large field, as the reading list on the "Semantic Web Trust and Security Resource Guide" shows. But exploration has to start somewhere. I am starting from the Semantic Web point of view.

Is there something in particular you would suggest I look at?

Posted by Henry Story on August 10, 2007 at 05:49 PM CEST #

It's not about building on the work of others; everyone does that. It's about needlessly building what's already built; it's about advancing the state of the art; it's about going where noone has gone before; it's about coordinating with everyone else in the same area; it's about the reason the phrase "Not Invented Here" even exists. It's nothing personal; the phenomenon is rampant among computer geeks.

Suggestions? There's a reason I updated Wikipedia.

Posted by Eric Norman on August 10, 2007 at 09:12 PM CEST #

While i agree with Eric, i think from experience, it's not so simple.

The problem as i see it is something in common with web "standards" and numerous "specifications" down the years. You start reading and discover half of them were dropped or weren't worked on for years and you've wasted days of work.

The ONLY way this could work is if there were some kind of primary community that many of the people in the Wikipedia listing were part of.

I couldn't see such a resource in the Wikipedia entry, which suggests fairly fragmented approach. Maybe something would be good to add... at least that way we could ask "has anyone done?" or "does anyone know?".

Posted by Steven Livingstone on August 11, 2007 at 01:42 AM CEST #

I would not be intimidated by the NIH comment. I view it as a symptom of what maintains the status quo, in military-motivated public key crypto and trust models. 15 years after public key went global, its still stuck in a mid-80s paradigm.

As someone with a 70s-era Marxist leaning once commented to me, the first line of defense is to throw the academics at a revolution - to build the case that revolution is not necessary, the speaker is evil, and the status quo should continue ...through the very process that means change in strong crypto usage by civil society stays within bounds defined by military dogma: centralized control.

What is interesting about the initiative is its lack of control plane elements. Having met lots of w3c folks, I know this is a deliberate political act, one tied to the self-replicating and self-extending nature of the web phenomenon itself.

Accept what an internet founder once articulated: the Internet is an amplifier. Now, show up why the control dynamics are omni-present in traditional TTP-dominated telco, and let the net amplify the fears that underlie that policy. The web will then instrument a response.

Posted by Peter Williams on August 11, 2007 at 11:50 AM CEST #

I can now see the use of Jeremy Carroll's algorithm described in his paper "Signing RDF Graphs".

The point is that if a foaf file signs itself, then the reader will not know about the signature until after having read the file, by which time the byte stream that was signed may well have been thrown away. So it helps to sign the foaf graph itself, which would also remove the need to have separate signatures for the n3 and the rdf serializations.
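A naive sketch of what signing the graph rather than the bytes could look like: for a graph with no blank nodes, a canonical form can be as simple as the sorted set of its N-Triples lines, and any serialization then hashes to the same value (Carroll's algorithm is what handles the general, blank-node case). The triples here are illustrative:

```python
import hashlib

# Two serializations of the same two-triple graph, in different orders.
serialization_1 = [
    '<#me> <http://xmlns.com/foaf/0.1/name> "Henry Story" .',
    '<#me> <http://xmlns.com/foaf/0.1/title> "Mr" .',
]
serialization_2 = list(reversed(serialization_1))  # same graph, other order

def graph_digest(triples):
    # Canonical form for a ground graph: the sorted, deduplicated triple set.
    canonical = "\n".join(sorted(set(triples)))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Both orderings hash identically, so one signature could cover every
# serialization of the graph:
assert graph_digest(serialization_1) == graph_digest(serialization_2)
print(graph_digest(serialization_1)[:16])
```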

So it looks like there is a useful relation to be added here to the wot ontology.

Posted by Henry Story on August 17, 2007 at 11:38 AM CEST #

Hi Henry, I like your approach of using UML to graphically represent RDF/OWL. I was wondering how would you represent sub-property relationships (e.g. foaf:knows with its descendent rel properties) since UML does not seem to have a standard way to do that. What is your stylistic preference?

Posted by Revi on September 10, 2007 at 02:33 PM CEST #

Revi: For subproperty relations I just use an inheritance arrow between two properties. That's what I did here

It makes sense there I think...

Posted by Henry Story on September 10, 2007 at 03:15 PM CEST #

simple! thanks!

Posted by Revi on September 10, 2007 at 03:27 PM CEST #


One problem with PGP I just came across is that if one loses one's password one is, I think, stuck. One may not be able to change one's public key for a given email address. Now in my case very few people have my PGP key, so it will be ok to change it, but I can see that this may have slowed down adoption. If the PGP key had a pointer to the foaf file, then it would be possible to update the pointer in the foaf file to point to the latest PGP key, and so get out of tricky situations like this.

Also here is an interesting article showing how one can get very far without encryption, using just whitelisting technologies built on foaf and openid:

Posted by Henry Story on December 12, 2007 at 01:58 PM CET #

I must have been a little tired yesterday when I forgot my password. That is I think only a problem when one has encrypted some serious content with a key. Then one is stuck. Otherwise it is just a matter of recreating a key.

Posted by Henry Story on December 13, 2007 at 05:14 AM CET #

is an integrated social network as a web of trust based on PGP Keys, maybe you are interested to have a look and to discuss using the PGP Key with the friends from this app as a web of trust.

Posted by Me on January 10, 2008 at 10:09 PM CET #

Maybe dct:hasFormat property is more appropriate in this context. It's not quite a subproperty of iana:alternate, but:

@prefix dct: <> .
@prefix iana: <> .

{ ?x dct:hasFormat ?y } => { ?x iana:alternate ?y } .

Posted by Andrey Nordin on April 20, 2008 at 05:19 PM CEST #

I have removed the wot stuff from my foaf file now, as I think one can get the same effect more cleanly with foaf+ssl:

Signing one's own foaf file is good, and signing others' files and certs is also good, but one needs more infrastructure to do this correctly. Perhaps not much, it is true, perhaps just a few simple scripts. But for this to catch on one needs good practical, easy to implement use cases, which just seem to come easier with foaf+ssl. See for example:

The wot ontology is also a bit too tightly and unnecessarily linked to pgp. It is better to build on the underlying crypto algorithms that appear in all these standards, namely rsa, dsa, etc... So with the rsa ontology one can easily publish one's public key, but in a way that makes it possible to work with X509 as well as with PGP. You will now find the crypto: and rsa: name spaces in my foaf file.

The core of the wot ontology is right though, and it inspired me to look further. Well this is where I am at now. We'll see how this pans out. So sorry if some of the links on this page no longer reflect the text in this blog....

Posted by Henry Story on January 02, 2009 at 07:04 AM CET #
