Can you tell us a bit about how you grew up? The power of your story, in a time of change

This month I took part in a slightly unusual interview. The interviewer was from a publication called Authority and the interview was carried out entirely via email, but that's not the unusual bit. The title of this interview was: 5 Things You Need To Know To Optimize Your Company’s Approach to Data Privacy and Cybersecurity. Again, nothing unusual there—I have spent several decades studying how companies approach data privacy and cybersecurity. But consider the very first question of the interview: Can you tell us a bit about how you grew up?

Six-year-old me, with my father, an engineer, in Canada, 1959

That may not be an unusual question if you were being profiled by a lifestyle magazine, but as a prelude to professional opinions on cybersecurity? To me, that was unusual.

However, as I thought about my response—words that would truthfully answer the question while remaining relevant to the context—I not only enjoyed the process, I realized that this was a question I'd been discussing with myself for decades. 

Furthermore, across those decades, the answer has changed, many times. Indeed, the answer to "how I grew up" was often a story of both origins and change, a way to make sense of how my life started out and then turned out. And of course I have told that story many times, in job applications and interviews, at business dinners and networking events, and on the Internet via websites and social media profiles. 

I don't know how you feel about making sense of your life, but I have found that having a coherent personal narrative of my life has helped me to cope with some of the tough times that I've had to live through, mercifully few though those have been. (I am well aware that I have enjoyed exceptional good fortune in life and, as a white male, a massive amount of privilege; however, I have had to face grief and loss, prejudice and enmity, and I know from personal experience what it is like to be unemployed and homeless.) 

Your story of change

Speaking of tougher times, 2020 seems to be determined to bring more of these to more people in more places than any other year since the 1940s. During the ongoing Covid-19 upheaval I have found myself advising several people whose lives and careers are now—for a variety of reasons—in a period of involuntary transition.

However, because I am not a professional career counsellor or life coach, I felt obliged to bolster my own advice with that of experts. Fortunately, I found this very relevant perspective:

When you’re in the midst of a major career change, telling stories about your professional self can inspire others’ belief in your character and in your capacity to take a leap and land on your feet. 

This appears in an article titled What’s Your Story? by Herminia Ibarra and Kent Lineback that was originally published in the Harvard Business Review magazine. While the article was written over 15 years ago, it remains 100% relevant to 2020. Both authors are considered experts in their field with books to prove it. 

According to Lineback's profile on Amazon, "he helps companies and executives tell their stories, so others can learn from their experiences." Ibarra is an organisational behaviour professor at London Business School and offers lots of organizational and personal development resources on her website; she is also active on Twitter as @HerminiaIbarra.

The authors begin their discussion of "Why You Need a Story" with this observation:

"All of us tell stories about ourselves. Stories define us. To know someone well is to know her story..."

However, and to the point of this blog post, they continue: 

"Seldom is a good story so needed, though, as when a major change of professional direction is under way...In a time of such unsettling transition, telling a compelling story to coworkers, bosses, friends, or family—or strangers in a conference room—inspires belief in our motives, character, and capacity to reach the goals we’ve set."

If you are dealing with an unsettling transition right now, I strongly urge you to read What's Your Story. And if you are hesitant about the idea of "telling stories," the authors make it clear that: 

"In urging the use of effective narrative, we’re not opening the door to tall tales. By "story" we don’t mean "something made up to make a bad situation look good." We’re talking about accounts that are deeply true and so engaging that listeners feel they have a stake in our success."

Personally, I have been very fortunate to have a lot of time to think about my life this year, and I now see that in the past my career benefited greatly from discussing—with myself and others—factual accounts of my life that are both "deeply true" and "engaging." 

In a 2018 TEDx talk, Ibarra refers to her work as teaching and researching people who come to those points in life that she calls "what got you here won't get you there" moments. I think most of us have experienced moments like that, even before 2020. I hope her article, and the other resources that I have pointed to in this blog post, prove helpful to you in getting through such moments now and in the future.

My story of change

Allow me to close with my version of "a bit about how you grew up" that appeared in Authority, the online publication which uses this tag line: Top Lessons. Top Authorities. Authority is published on the Medium platform, which I have used a few times myself—like this story about lack of trust in tech companies—but Authority uses Medium at scale. I think at least a dozen other people were interviewed with the same set of questions. You can read the full interview here, but the following is the bit about how I grew up:

I have spent much of my adult life in the US but was born and raised in Coventry, England, a city synonymous with innovations in industrial technology, like the pedal chain bicycle and the turbojet engine, and manufacturers like Jaguar, Land Rover, and Triumph. My father was an engineer, as were my grandfathers. As a teenager in the 60s I aspired to be a celebrated poet and songwriter, but the oil crisis of 1973 crushed funding for the arts and I pivoted into petroleum accounting, tax auditing, and from there to computing; that’s how I became enthralled by the clash of technology and ethics that is at the heart of cybersecurity.

I hope that gives you a sense of who I am, how I got to be who I am, and some of the changes I went through to make a career of studying how humans create and confront technology risks.

Just time for a quick update


You may recognize the phrase "Just time for a quick update" from John Oliver's show "Last Week Tonight With John Oliver." Like me, John is a dual national (UK/US) who was born in the UK, in the part that is called The Midlands.

John was born in the city of Birmingham. I was born in the city of Coventry. These two cities are close together but have remained separated by about seven miles of protected green space, thanks to some sensible planning here in the Midlands. (Note: it is not called the Midlands because it is in the middle of the UK; it isn't. It is in the middle of England, which is one of the four countries that make up the United Kingdom. It's complicated.)

A year ago today, I arrived here in the Midlands from America, with my partner, Chey, to explore a possible future in which we could be closer to my mum—who turned 90 in 2019—and my brother and his wife. Mum was born and raised and still lives in the Midlands. My brother and his wife now live in Spain. 

Less than six months into this experiment, the parameters changed: Coronavirus created a whole new set of variables, including restrictions on our ability to go to Spain or back to America or pretty much anywhere. 

Obviously, no "quick update" can capture the many and varied implications of all this, but fortunately I can point you to some of the things I have been doing during this time, namely research and writing on malware, cybercrime, cybersecurity, and a worrying lack of trust in tech firms

I will try to share some of the details of our ongoing experiment as time permits, mainly in the hope of helping others who may be dealing with some of the same challenges we have faced; I will also share some of the joys we have encountered, like the view at the top of this post. That's what the way to my mum's house looks like on a sunny day in late summer or early autumn.


Brexit: 11 p.m. GMT on 31 January 2020

[UPDATE: 00:01 on 1 January 2021 — The UK completed its departure from the EU. A bad idea has now become a bad reality, IMHO.]


I always thought that joining the EEC/EU was good for the UK.

I always thought that leaving the EEC/EU would be bad for the UK.

I am not happy that Brexit is happening. Period. Full stop.

No, seriously, that is the whole article. Nothing more to read. Too sad and angry to write any more.

What Am I Thankful For? A diagnosis of congenital amusia

In November of 2008 I wrote: "we’ve arrived at the time of the year when it’s traditional to speak of things for which we’re thankful, I figured I would put it like this: I am thankful for a diagnosis, even though that diagnosis is hemochromatosis." Now I'm back with thanks for another diagnosis, one that thankfully does not involve physical pain and suffering, although it has had quite an impact on my life.

The difference a name makes

It was my partner, Chey Cobb, who received that diagnosis of hemochromatosis. The thankfulness we felt at getting this diagnosis came from having a name for the constellation of symptoms that had forced her to quit working and turned her daily life into a daily struggle (one that, sadly, has continued to this day). We were both surprised by what a difference it makes to have a name for the suffering you've been going through.

As inveterate researchers, we saw Chey's diagnosis as a starting point for exploring treatment options, finding support groups, and lobbying policy-makers. I started a Facebook page and website to raise awareness of hemochromatosis, which is widely under-diagnosed and not well understood by many doctors. We personally validated a CDC study that found the average time to get one's hemochromatosis correctly diagnosed was nine years, enough time for the condition to cause irreversible damage to joints, liver, heart, brain, kidneys, and other organs.

Sadly, we saw a replay of this diagnosis phenomenon three years ago when doctors confirmed our daughter's suspicions that she had Multiple Sclerosis (MS). The day she got that confirmation she called us in a state akin to elation, tinged with validation, even though she knew all too well that the road ahead was going to be a very tough one. But we understood how much it meant to have a name for what you've got.

Now hear this

When you get a medical diagnosis, particularly one that's taken many years to obtain, there are two phrases that are likely to come to mind right away: "that explains a lot" and "I knew I wasn't imagining things." (The latter is likely to be familiar to female readers - numerous studies show that the tradition of doctors telling women their symptoms are "all in your head" is still a thing.)

The diagnosis that I am thankful for today "ticks all the boxes" as they say in England: it explains a lot, and it validates a whole bunch of thoughts and feelings I've had since December, 1959. That's when, during rehearsals for the school Christmas concert, I first learned of the problem for which I now have a diagnosis: congenital amusia.

Technically, "a deficit in fine-grained pitch discrimination," what I have is sometimes called "tin ear." Indeed, what the teacher said to seven year-old me was: "Stephen Cobb, stop singing, you have a tin ear." What Mrs. Ashby did not know, and I have only just learned, is that I was born that way. In other words, congenital amusia means that I have always been, from birth, somewhat tone deaf.

(I don't want to go into detail about the congenital amusia in this article - I put together the 4amusia website for more information - but studies show that 4% of people have this disorder. My particular form of amusia is not severe, it doesn't mean I don't enjoy music, and I don't lack a sense of rhythm; but, regardless of how hard I try, I can't sing or learn a musical instrument - my brain lacks something in the pitch processing and retention department.)

What I am so thankful for today is the knowledge that my inability to carry a tune or learn a musical instrument is not due to laziness, sloth, or weakness of character - qualities of which I, and many other people with my condition, are routinely accused. I am so grateful that I can now say, with scientific certainty, that those accusations were inappropriate.

Lingering effects

I'm sure I could write a whole chapter about how much it hurt to suffer those accusations, the self-recrimination and doubt that it induced. I know I could have done without the castigation of teachers who were sure I could learn to play the recorder - a rite of passage in English schools of the 1950s and 60s - if only I would apply myself.

Then there's the chapter on how frustrating it was to grow up in the sixties with a strong poetic streak but no ability to voice the songs I composed, not to mention fruitless hours failing to learn guitar. Sure, I could pose for the album cover, but I was never going to be on the album.

But today I'd much rather give thanks for the unexpected gift of this diagnosis: the empathy it has given me for this thing called neurodiversity, the growing realization that human beings are not all wired the same way.

While I realized long ago that organizational aversion to people who are "different" is bad for organizations, and bad for "differently-abled" people who can bring great insight and real value to any mission, I have to admit that I didn't truly 'get' neurodiversity until I learned that my own brain had a wiring issue.

And as I look at what is happening today in terms of research, it strikes me that there is great potential for humans to learn more about the many different ways in which we are wired. These days a decent school is going to recognize something like dyslexia at an early age and respond appropriately. Hopefully, schools will soon be recognizing that some children don't hear pitch the same way most people do.

While I sometimes get quite emotional about this topic, let me be clear that knowing more about neurodiversity isn't just about people feeling better about themselves, it has seriously practical implications. Knowing the ways in which you are different makes you better able to be the way you are, and it sometimes happens that there are benefits to being wired differently. Society is better off as a whole if we can see that, and go with it.

*With a huge thanks to those scientists who believed people when they said "my failure to learn an instrument was not for lack of effort."

It's official! I'm making some big changes

I have retired from my corporate position and we're moving to England!

After many enjoyable years with ESET—the organization I've worked for longer than any other—I began to think it was time to change things up a little, or down a notch, depending on your perspective.

And I knew that—owing to several factors on which I will elaborate later—the change would involve a move. So we began to look at living somewhere other than San Diego.

When Chey and I went to the UK earlier this year—for my mum's 90th birthday—we arrived at the conclusion that we would like to move closer to her. We now plan to complete our relocation by early September, to a cozy place just a short walk from mum's flat in Coventry, the thousand year old city in which I was born. And when we've unpacked and the dust settles, I expect to be sitting in a comfy chair in a small study with a big internet pipe, conducting independent research into the darker aspects of humans and technology.

I will probably reemerge as Stephen Cobb, Independent Researcher. Down the road it could be Stephen Cobb, Public-Interest Technologist. (And I wouldn't rule out Prof. Cobb since Coventry has two thriving universities and there are several more nearby, including my alma mater, the University of Leicester).

What? When?

Timing is not always everything, but it did play a big role in this set of changes. By the end of 2018 I had reached a point in time that is referred to in America as "full retirement age." This is when Americans can start receiving the full amount of their pension (if you were born in 1952, that age is currently 66). What I mean by "pension" is Social Security retirement benefit, but we decided to use the term pension because in England "social security" means something quite different.

As 2018 unfolded I began to see this pension as a "social retainer," a way for me to finance a different approach to my life's work, a chance to labor at my own speed, in my own way. I will write more about that work in a different place, but suffice to say it involves - among other things - helping the world to "enjoy safer technology." As you may know, that phrase is how ESET - my former employer - frames its mission, and it's one reason that I worked there so long.

I realized that a pension potentially means being able to choose my own strategy - like writing a book to give substance to the points I want to make, or making those points as an independent voice, not someone employed by a corporate entity (to be clear, ESET had an admirable commitment to objective research and required me to stay "vendor-neutral" in my public speaking - but one ethical company cannot save the reputation of an industry that needs redeeming).

But why did I say: "a pension potentially means being able"? Well, the enabling power of a pension is dependent on the size of that monthly check from the government relative to the cost of living where you live. Exactly how dependent will vary based on your circumstances. All of which turns out to be quite relevant to our decision to move to Coventry in England, as I will now explain.

How much?

The "Too Long, Didn't Read" version is that the pension checks which Chey and I started to receive this year are not enough to live on in San Diego given that we don't own a home here. We are members of a fairly large group of people whose assets were wiped out by the Great Recession, so we entered this decade with no savings and no home of our own.

Since 2011, we have lived in rented property in San Diego, where the average rent is now over $2,000 a month. When we moved here we decided to live near the ESET building in Little Italy so that I could walk to work (which costs a lot less than driving, with way less stress). You pay a premium for this location but sadly, Little Italy has become less of a community in recent years, and more of an entertainment district. We have felt it grow less livable even as it has become less affordable, providing additional incentive to move from our current location. (After dozens of moves in the nearly five decades since I left home, I've come to see moving across the country or over the ocean to be no more of a pain than moving across town.)

Last year, rents in San Diego as a whole rose 7%, and the average monthly rent in Little Italy is now over $2,400, and still rising. We pay slightly more than that, for a decidedly smaller place than the one we rented for $1,750 when we first moved here in 2011. So, unless you already own property in San Diego, or have managed to accumulate and retain a large nest egg, the prospect of retirement here, however appealing it might seem, is economically infeasible.

Being researchers, we analyzed numerous "more affordable" places after our nest egg was cracked by the Big Bank Fraud (then smashed by the Great Recession and mopped up by the for-profit healthcare industry). Turns out we can live in a nice house in Coventry for less than half what we currently pay in Little Italy. True, Coventry has less than half the number of sunshine hours you get in San Diego, and twice as much rain, but our pensions should be enough to pay the bills plus occasional flights to see my brother in Spain, while keeping us in wax jackets and wellies to boot.

The changes we are making this year have already taught us a lot and as our journey continues I will endeavor to share what we discover along the way. In the meantime, I will be tweeting as @zcobb if you'd like to follow me there.

23andMe and Hemochromatosis


This blog post is a placeholder related to a conversation that started back in 2016 when someone wrote to me, as follows:

I read your blog regarding Hemochromatosis and decided to look further into the 23andMe test. They tell me that their test results do not report on HFE. Do you know if this is a recent change with their testing or am I missing something? Below is the email correspondence I had with 23andMe [not reproduced here]. Do current 23andMe test results show C282Y, H63D and S65C mutations? If so, where do I find this information in the reports?
The question was addressed to me because I had been researching hereditary hemochromatosis, which is my partner's condition. It is caused by a mutation in the HFE gene that can cause the body to mishandle iron intake. This can lead to excess iron in your joints and soft tissue, an affliction known as iron overload. If not treated and managed, iron overload can cause permanent damage and may prove fatal.

We had both been early customers of the 23andMe genetic testing service. Back then it was possible to get information about one's HFE status (known by codes like C282Y, H63D and S65C). However, in 2013 the FDA took issue with 23andMe and blocked access to this data. (Some of the background to this, from 23andMe's perspective, is here.)

In response to the FDA restrictions, people found a way to extract the HFE data from the raw 23andMe genetic data (to which the FDA did not bar access). That was the situation in 2016 when I received the inquiry cited at the top of this article. However, in 2017, the FDA allowed 23andMe to resume the provision of HFE results (as described here).

The bottom line is that the set of instructions that I wrote up in 2014, documenting the workaround to determine HFE status from the raw data, is no longer needed.


High blood pressure cure? For some, this treatment is not a conn

Short Version/TLDR: I used to have high blood pressure or HBP. Now I don't. 
  • If you have HBP and low potassium, check out Conn's syndrome. 
  • If you have Conn's syndrome, an operation can fix it. 
  • I had the op in 2013 when my BP was 150/100 while on HBP meds. 
  • At the end of 2013 it was 120/70 w/out meds, and it still is.

Why am I re-sharing this information?

I wrote about my experience with Conn's syndrome back in 2013. This blog post is simply a re-sharing of what I wrote back then (with one new piece of data at the end).

Why am I doing this? Every time I hear a person say "I have high blood pressure" or HBP, my thoughts go like this:
  • I know what HBP is like.
  • HBP is not very nice.
  • HBP can shorten your life.
  • I am extremely fortunate that I don't have HBP any more.
  • Should I tell this person about Conn's syndrome?
Of course, the answer to that question depends on a range of variables: who is the person saying they have high blood pressure? Where is this being said? Do I know this person? I try to weigh these variables before speaking, but as people who know me will tell you, I tend to err on the side of speaking up, sometimes to strangers. I also have a tendency to speak up about some things that other people might prefer to keep private.

However, a fair number of people have thanked me for sharing the story of my battle with high blood pressure because they found it helpful. And that is why I wrote about my experience, so people could "read all about it" if they wanted to, rather than listen to me talk about it. Also, I could refer people to my blog if there was not the time or inclination to go into details in person.

So here are the relevant blog posts in historical order (as in earliest first - I am not suggesting that these articles are 'historic'):

What now?

I always intended to write one more blog post on this topic, documenting the long-term prognosis and perhaps adding some references. I guess this is that 'one more' blog post. Sadly, I don't have time to do a full reference list, but this article on Conn's syndrome is quite helpful, as is this more technical paper.

My sense from reading the literature is that there will be many more cases like mine: people cured of their HBP, often after years of being told that their HBP had no known cause and they just weren't eating and living right. These people will be identified by: [a] continual improvements in ultrasonography (US), computed tomography (CT), and magnetic resonance imaging (MRI); and hopefully [b] greater awareness of Conn's syndrome.

The summer of 2018 marked the five year anniversary of my operation and return to 'normal' blood pressure without drugs. It has been a busy five years. I started a master's degree in late 2014 and graduated in early 2017, all while carrying a very full workload (from an employer wise enough to subsidize graduate school tuition).

For the most part I have felt pretty healthy. I have had some issues with my digestive system and I sometimes wonder if that is a lingering side effect, not of the adrenalectomy itself, but the infection I got during my hospital stay. 

Nevertheless, that operation was well worth it and I feel very fortunate that—thanks again to a wise employer—my health insurance covered it. I am reminded that it is in the national interest for everyone to have access to affordable healthcare, so that the negative economic impact of conditions like HBP can be reduced by more efficient diagnosis and treatment.

The sting in the tail

An update from late 2019: my adrenalectomy did not cure my atrial fibrillation, which was probably caused by the excess aldosterone in my body during all those years in which my primary aldosteronism went undiagnosed. Sadly, "the current diagnosis of primary aldosteronism is suboptimal–its delayed diagnosis results in end-organ damage that requires complex management...an increased awareness of primary aldosteronism is required in both primary and tertiary care so that an earlier diagnosis can be made for optimal patient outcomes." That's from a 2018 article in the Australian Journal of General Practice published by the Royal Australian College of General Practitioners (here's a link to the article).

All the more reason to let more people with high blood pressure know about Conn's syndrome, so they can ask their doctors to investigate, before excess aldosterone has a chance to do damage.

What's this #HeForShe thing?



Technically speaking, #HeForShe is a hashtag, a social media tool defined as: "a word or phrase preceded by a hash or pound sign (#) and used to identify messages on a specific topic" (Wikipedia).

About two years ago I started adding the #HeForShe hashtag to things like the "Welcome to CobbsBlog" page and my Twitter profiles (@zcobb and @thestephencobb). The #HeForShe hashtag originated with, and is the name of, the UN Women’s solidarity movement for gender equality.

The idea behind HeForShe is that it: "invites men and boys to build on the work of the women’s movement as equal partners, crafting and implementing a shared vision of gender equality that will benefit all of humanity."

Tagging things #HeForShe is a way for me to share the fact that I have accepted that invitation. Why? Because I truly believe that gender equality does benefit all of humanity. I also believe that gender equality will not be achieved unless more men - most men, all men - commit to it, and make it a priority, in practical terms and not just as a vague aspiration.

Getting schooled on #HeForShe


I came to know about #HeForShe because I was studying at the University of Leicester when, back in May of 2015, it joined the UN Women’s HeForShe solidarity movement as an IMPACT 10x10x10 champion, one of 10 universities around the world participating in the program with the goal of taking "bold, game-changing action to achieve gender equality within and beyond their institutions."
"Announced at the World Economic Forum in Davos, Switzerland, in January of 2015, HeForShe’s IMPACT 10x10x10 programme engages 30 key leaders across three sectors—the public sector, private sector and academia. All 30 IMPACT champions have made common commitments and have also developed tailored commitments, formally reviewed by an expert team at UN Women and approved personally by the Executive Director of UN Women, Phumzile Mlambo-Ngcuka."

But the fact that my school had embraced HeForShe was not why I chose to do so. Gender equality is something I have honestly believed in from well before my first stint at university (University of Leeds, 1971-74). I can't say that I was born a feminist - the scientific jury is out on whether that is even possible - but I knew that I was a feminist-sympathizer as soon as I heard the word used in a sentence. That would have been around 1965, shortly after I became a teenager and read The Feminine Mystique.

Here's what happened: about that time my mum enrolled in college under a government program to reduce the shortage of teachers created by the baby boom. Her decision - which my dad supported practically, emotionally, and philosophically - resulted in a real world experience of gender equality in action. Among other things it demonstrated that:

  1. Women can have a productive career outside the home.

  2. This is not a threat to men.

  3. Men and boys can do housework quite well.

On top of that, mum's time as a mature student created a steady flow of interesting books into our house, notably the aforementioned 1963 classic, The Feminine Mystique, by Betty Friedan. This has since been "widely credited with sparking the beginning of second-wave feminism." As I read - entirely of my own volition - Friedan's analysis of women frustrated with society's narrow and deeply limiting definition of what a woman should be - wife, mother, cook, cleaner - it rang true with my own observations.

That's right, I had - for whatever reason - been observing women from an early age (maybe I was born to be a social scientist). As a child I was surrounded by women, at home, at church, and at the shops. I listened to them talking. I read women's letters to the advice columns in ladies' magazines (which were definitely not feminist back then).

Rather fortuitously, my childhood in Coventry, England, was enriched by frequent visits from numerous aunts and great aunts, all of whom had survived at least one world war. My mum's mother had actually lived through aerial attacks in both World War One and World War Two. All of them had lived through large-scale bombing campaigns, including the one in 1940 that killed over 500 people in Coventry in one night and destroyed two-thirds of the city's buildings (Wikipedia). My grandma and several of her sisters worked in munitions factories which were targeted in these campaigns.

Often when I was small these women, most of them housewives with grown children, would sit and talk about those times gone by, and I would quietly listen at their feet. That is how I came by precious historical vignettes like this: my Great Aunt Tot standing in the middle of the street shaking her fist and swearing at a German Messerschmitt 109 as it made a daylight strafing run on the factory at the end of the road.

So maybe it is not surprising that I grew up thinking of women as strong, independent individuals; all the while growing increasingly angry that society would not treat them equally. Yes, there has been some progress, but nowhere near enough. Hopefully #HeForShe can help us move things forward.

Of allies, male feminists, and good men


I hope to find time to write more about HeForShe but in the meantime I will try to use the hashtag wherever appropriate in order to raise awareness of gender inequality and the need for men to work to eliminate it.

What I will try to avoid is referring to myself as an ally of women, or a male feminist, or a good man. Those are designations to which I aspire, but it is not for me to claim them.

Will "repeal and replace" hurt genomic medicine and victims of genetic conditions?

Let me give you the short version of my answer up front: Yes. If the current privacy protection for genetic medicine in the US, in which Obamacare/ACA has played a key role, is diminished by the "repeal and replace" efforts of the current US administration, then America's hopes for genomic medicine will also be diminished. Victims of some genetic conditions will be particularly hard hit, as will all forms of research that involve the human genome.

The even shorter version goes like this: Why would I give anyone my genetic information if that might lead to me and my family being denied insurance or paying higher premiums, for medical, life, or long-term care policies?

Fans of genomic medicine are apt to respond by saying there's no need to worry because there are laws to prevent that type of discrimination. To which I have heard many people say: I don't trust the insurance companies and/or the government to abide by those laws. And besides, laws can be repealed, and databases can be hacked.

In short, when it comes to enjoying the benefits of medical science, Americans face a bleaker future than the residents of other wealthy countries due to the absence of two rights: the right to health care and the right to privacy.

Background

Who am I to present these arguments? For more than 25 years I've been studying information security, data privacy, and risk. I've been a Certified Information Systems Security Professional for more than two decades and I have a Master of Science degree in Security and Risk Management. I have also put in more than a decade as primary caregiver for someone with a genetic illness (variously known as hereditary hemochromatosis, genetic haemochromatosis, Celtic Curse, Bronze Diabetes, Iron Overload). In that role I have spent many years interacting with the families of hemochromatosis patients and the main support group for this condition, the Iron Disorders Institute.

What is the problem? The House recently passed legislation called the American Health Care Act of 2017 (H.R. 1628). There is a Senate version known as the Better Care Reconciliation Act of 2017. As far as I know, both of these pieces of legislation remove a gene-related provision of the current law, ACA (a.k.a. Obamacare). Here's the problem:
  1. The Genetic Information Nondiscrimination Act of 2008 a.k.a. GINA says employers and health insurers can't use your genetic data in hiring decisions and health insurance coverage; but, as Maryam Zaringhalam at Slate points out: life, disability, and long-term care insurance are not covered under GINA’s provisions, and those insurers "already use genetic testing results to deny coverage to otherwise healthy individuals".
  2. Furthermore, GINA only protects people who are genetically predisposed to a disease as long as they are asymptomatic. In other words: "once a person begins showing symptoms, GINA no longer matters" (Zaringhalam; see link in References below). For example, my wife was born with the HFE mutation that can produce a potentially fatal condition known as iron overload but she was asymptomatic for the first few decades of her life. Then, in her forties, due to a phenomenon dubbed hemopause, she became increasingly symptomatic. She is now eminently "declinable" under pre-Obamacare rules.
  3. This GINA "loophole" as Zaringhalam calls it, was closed by Obamacare. That's because the ACA outlawed discrimination in health care insurance pricing or coverage based on preexisting conditions.
  4. Now the current administration looks set to return America to the days when preexisting conditions were considered grounds for charging higher insurance premiums.
  5. That would mean returning health insurance to the list of things you pay more for if your insurer has knowledge of your genes. Remember, that list already includes life, disability, and long-term care insurance.
I would be the first to admit that the above is a simplified account of the problem, but I stand by its accuracy and will go into more detail below. A complicating, and possibly offsetting factor in this story is the plethora of state laws on genetic data, medical privacy, and health insurance. Those might give you hope, but then you have to factor in the rampant hacking of supposedly private databases of personal and medical information that we have witnessed over the past few years. Bottom line? It is not hard to understand a response of "No way!" when you suggest to someone that they should get their genes tested, even when that test could potentially save their life, or those of their relatives.

There was no valedictorian and other observations on the way to my graduation

Last month I graduated from the Criminology Department of the University of Leicester with a Master of Science degree in Security and Risk Management (MSc SRM). I graduated in person, in England, with my own two-person cheering section (mum: Dorothy; partner: Chey).

The trip to get there was a long one, and I don't just mean the miles (6,000) or the years (two spent on the course, but many more getting ready for it). However, the journey was well worth making, and the graduation ceremony was well worth attending, even though it raised several questions that I feel obliged to answer here.

1. Why graduate in January?


The timing of my graduation ceremony was awkward to say the least, but it was due to the fact that the SRM program that I wanted to pursue has two cohorts per year, commencing in March and September, with two graduation ceremonies, July and January. I was in a September cohort for which the usual graduation is January.

That is not, in itself, awkward, just unappealing, given how cold and grey January weather can be in England (for the photo of Chey and me on the right I had to crank up the Brightness).

But the exact timing was awkward, given that my employer, ESET, whose generous employee education program had funded my studies, decided to hold its annual North American Partner Conference (NAPC) the same week as my graduation.

The NAPC is a great event, hosted at the San Diego Hard Rock Hotel, and as head of the US Research Team I was expected to address the partners on the 2017 cybersecurity threatscape, the world into which they would be selling ESET's security solutions in the months ahead.

Fortunately, it was possible for me to do that, and go to the graduation, by speaking before lunch on the first day of the conference and then taking the direct BA flight from SAN to LHR later that afternoon. Unfortunately, that meant getting to our UK home base of Coventry late in the afternoon of the next day, checking into a hotel, having dinner with Mum, and then rising next morning to head for Leicester. Not a lot of time to get over jet lag, but it was do-able.

2. Second or third masters degree?


At the end of my remarks to the NAPC I apologized for not being able to hang around for the whole two-day event, making a joke about having to go and get my degree because the university refused to change the graduation date to accommodate ESET, even though it's one of the largest security software companies in the world.

That got a few laughs, but it's what I got over lunch that surprised me: questions about whether this was my second or third masters degree, or more generally: "How many degrees is that then, Professor Cobb?"

I can honestly say my initial reaction was entirely factual: I said that this was my first masters, two degrees total. Some people obviously assumed I had spent a lot more time in academia than is the case. But I had to chuckle when I told my classmates about this at our department's pre-graduation buffet, because they all said they would have played along with the assumption: "Second or third masters degree? Hmm, let's see, hard to keep track."

Of course, my fellow graduands were all security people, many working in physical and operational security, and thus accustomed to the odd piece of, shall we say, tactical social engineering. And for some of them this was their first degree, since it is possible to do a Masters degree in England without a Bachelors or, as in my case, without a relevant Bachelors. My first degree, back in the 1970s, was in English and Religious Studies (and the number of computers involved was zero).

A big motivating factor in attending my second graduation is that I skipped my first one. Why? I was boycotting the royal family. Allow me to explain. I have always objected to monarchy and my first degree would have been handed to me by the Chancellor of the University of Leeds, a position held at the time by a member of the British royal family.

I did not think that was appropriate and I did not want her handing me my degree. At the time, this posed something of a dilemma for my mum, seen here on the right. As far as we knew, I was the first person in our family to get a degree, so it was definitely something to celebrate, but on the other hand, my mum and dad had raised me to stick by my principles, on top of which, they weren't fans of the royal family either.

In the end we compromised and I posed for some suitably formal picture taking in my grandparents' garden, wearing the appropriate gown from a Leeds alum who was a friend of the family. (My grandfather might not have had a degree, but by the time he was 50 he was able to sell his share of an engineering firm in Coventry that he co-founded, and retire with a garden large enough for a bowling green and graduation pictures.)

3. Isn't that against the rules?


In America, the rules of academic hierarchy tend to be strict. For example, you will have a hard time getting a paid teaching gig at a US university if you don't have a masters degree. But rules can be bent at times, for example when a new discipline emerges. There was a time, not much more than a decade ago, when you couldn't hire someone with a computer security degree to teach computer security because such degrees did not exist.

This led to an interesting exchange when I was being interviewed for my job at ESET in 2011. The head of HR, who has since become a good friend, said to me: "Your resumé indicates that you taught master of science in information assurance classes at Norwich University, but how was that possible when you only have a bachelors degree?" To which I replied, "Well spotted! It was only possible because the Dean made an exception, based on my knowledge and experience."

In fact, the award-winning MSIA program at Norwich, created in 2002, was put together by someone with a PhD in applied statistics and invertebrate zoology, Dr. Mich Kabay. To create and deliver the program's online curriculum, Mich tapped Chey and me, along with a small army of security industry experts, none of whom - to the best of my knowledge - had a degree in security at the time. His approach paid off in short order as Norwich was quickly named a Center of Academic Excellence in Information Assurance Education (CAE for short) by the NSA's Deputy Director for Information Systems Security.

I was initially surprised that people assumed I had multiple degrees, and then I felt flattered. I decided it meant that they think I know what I'm talking about. And that is actually true most of the time: I do try to talk only about what I know, or at the very least, to provide a clear disclaimer when I'm asked, or tempted, to talk about something that I'm not sure about.

Over the years folks have occasionally referred to me as Doctor Cobb, and I have immediately pushed back. I do not have a doctorate, even now. But I am less concerned when folks call me Professor Cobb. I have taught at university, and may do so again at some point. However, and just to be clear, I currently only have two degrees.

4. What happened to the valedictorian?


Another funny thing that happened on my way to, and upon return from, my graduation, was the multiple requests from my manager for a copy of my valedictorian speech. According to Wikipedia, Valedictorian is "an academic title of success used in the United States, Canada, Central America, and the Philippines for the student who delivers the closing or farewell statement at a graduation ceremony (called a valedictory)." Fair enough, but notice which country/region is not on that list? Graduation ceremonies in England, and certainly the one that I attended at Leicester, do not have a valedictory or valedictorian.

The intent of the good-humored ribbing was to suggest that I had graduated at the top of my class. But that's another thing my class did not have: individual ranking. When I got my Bachelors degree in 1974, the results for all the students were posted on the department notice board, a physical object in a specific geographic location. Going to the department and looking at the board was how I, and all my classmates, found out that I got a First (English universities used to rank degrees as First, Upper Second, Second, and something else). As it turned out I was the first person to get a Joint First in English and Religious Studies at the University of Leeds, and the only person to get one that year. But there was no list of results ranking my class. For my masters I got my grade via a website and that only showed one result: mine (which was Merit, one level below Distinction).



So it is quite possible that I was not the top student in my class. There were 33 of us graduating and none of us asked about each other's grades - I think we were all just glad to have made it to the finish line, especially since most of us were holding down full time jobs, often in challenging places (like Kabul and Beirut to name two).

Indeed, whenever I was feeling like giving up I reminded myself that studying in San Diego was a lot easier than in a lot of the places my colleagues were coping with, so I should quit complaining, and besides, I was studying in my native language, which quite a few of my classmates were not (I confess that I'm awed by people who get a degree in a non-native language).

So in closing, but still speaking of languages, I promise my next post will be about the meaning and significance of the University of Leicester motto: Ut Vitam Habeant (here's a hint).

[Disclaimer: I have not yet written that blog post.]