A Blog dedicated to exploring privacy and technology

What it is like to be a diabetic …

Posted by Wayne on March 4, 2010

It's been a while since I've posted anything, so I figured to get back into it this year I'd start by exposing something about myself that I wish more people would talk about…

Today marks the one-year anniversary of my finding out I'm a Type 1 diabetic. With over 20 million Americans currently diagnosed with diabetes, and evidence that the number is going to more than double over the next decade, you might actually know someone who has this disease. In a recent American Diabetes Association magazine article I read that every 20 seconds someone becomes a diabetic. Every 20 seconds! Wow.

I thought I would share some of the things I've gone through and learned in the process … so let's start with a year ago. Everyone asks: did you have symptoms? Yup. Did I ignore them? Yes, for a few months. The first symptom was weight loss: 35 pounds in 3 weeks. I was going to the gym, so I was thinking, wow, this is working great. The problem was I really only wanted to lose about 15 pounds, and what I didn't know was that I was losing fat and muscle. The next symptom was that I was run down all the time. In fact, I would get home from work (which took everything I had to get through, plus the commute home) and would sit on the couch and fall asleep sitting up. Now I've always found it easy to catch a cat nap, but my wife knew something was wrong and was urging me to go see the doc. Nope, not me, I was fine. The final symptoms were that I had to urinate all the time, every 20 minutes, and I was craving sweetened drinks like Gatorade and Coke, which I've never been big on.

So on March 3rd last year (I remember it was a Monday and we had a snowstorm) I opted to work from home. My wife said that was it: I was going to see the doc, who probably had openings due to the storm. I went in, shared my symptoms, and he took the usual fluids to test and said he would call me the next day. Three hours later I got a call from him: go to the hospital, now, and have someone else drive, because he was pretty sure I had diabetes and my glucose levels were off the charts. A healthy person usually ranges from 75-150; my levels were in the 800s! Needless to say, I spent a few days in the hospital while they stabilized me, including two days of teaching me what my life was going to be like as an insulin-dependent person.

I thought I'd share some of the experiences, which for me have been life-changing. Not because I have to eat better and work out to stay healthy, because I already did that for the most part. No, it is more that I've always had relatively perfect health and never had anything serious happen to me health-wise. I went from no broken bones, no hospital visits, no pills required, to having to test my blood 4-5 times a day, count my carbs, and inject two kinds of insulin into my body 4-5 times a day.

When I first started dealing with the insulin, I would scurry into the bathroom if we were out, or hide in my office, to test and inject. Why? I was afraid to have anyone see me do what I've done over 1,500 times in the past year. At the time I was very self-conscious (still am somewhat), worried people would think I was damaged goods at work, and didn't want people to stare at me. For the most part I've found people are mildly curious or have seen it all before and don't care; plus I can get through the whole process in less than a minute at this point. Some people stare at me, but hey, I have to get over it. It is my lifeline, and the technology and products are so much better now than even 10 years ago. I've got it easy compared to those who didn't have the medical tech we have today, or who have had to deal with diabetes since they were small children. I've had decades of great health (and plan on many more).

I also went through the whole denial, anger, why-me stuff. That took most of the last year, and it still comes on in short blasts even now. The doctors not being able to explain what caused it, or what I should have or could have done differently, didn't help me answer the question of "why me". For my part in this journey, I've become more focused on my health overall: more gym time, dialing in the food/diet/insulin ratios, and reading and learning what I can about the disease, the medical technology, and progress toward a cure. I've also learned to listen to my friends and family, and I couldn't have made it this far without their caring and support; they have been awesome. The last thing is that I've dialed back pushing myself quite so hard. It means things slip off the list until I can get to them, and this is probably the hardest thing of all for me.

So one last bit before I end this posting. I had to share one moment that was really hard for me. I had finished working out at the gym and was stopping at Dunkin's on the way home for a nice hot coffee, and I walked in and just stopped and stared at the racks and racks of donuts. For some reason it hit me so hard: here was all this stuff I liked to have as a treat once in a while, and it was all off-limits. Why did I have to be different? The moment passed, and I can actually have a donut once in a while (as long as I take my meds and don't make it a habit). What I did learn is that, yes, my life is different now. I have to plan my days and stay on track with my changed ways, and maybe someday diabetes will be cured. In the meantime, I'll do my part.

My call to action to you is: get your glucose levels tested, find out what your A1C number is, watch your diet, exercise, and live long. Oh, and cheers to another year that we get to walk the planet.


Posted in Uncategorized | Leave a Comment »

Privacy Knowledge – Solving for the X factor

Posted by Wayne on December 31, 2009

I was toiling away today on the third version of my dissertation proposal (more like a complete redo!) when I came across the paper "An Ontology-based Approach to Information Systems Security Management", written by three researchers from Greece. What struck me as an important idea from their paper is that they created a way to categorize security management into a framework that could be codified as a schema in a database (more on that in a sec) and, most importantly, is based on "security knowledge".

Why is "security knowledge" (SK) such an important construct? Those who have been in the security business a long time probably already have a good idea of what SK is. SK is made up of a thorough understanding of the context of the infrastructure, the collecting of requirements, the actions that are to be deployed, and the value of information assets. Most organizations have some or all of these elements in their security knowledge base, though it is likely diffused over several people, departments, or products. The diffusion of security knowledge is not a trivial problem in its own right, though it is beyond what I wanted to get to in this posting.

As a privacy researcher, what struck me as a key concept in this research is that there is a parallel need for "privacy knowledge", and it has to stop being an also-ran to security knowledge. Privacy is important because privacy is based on information, and we are literally exuding exabytes of information annually, much of it personal (70% according to this IDC link).

In the security domain, one of the metrics listed by Herrmann in her book "Complete Guide to Security and Privacy Metrics" is security policy management. I found it interesting that among the 972 metrics she lists there is no corollary to security policy management, which I would think would be called "privacy policy management". Yet with the Cloud, or any Internet-based business these days, privacy policies are treated as a "ho hum, ya we got one" item. But have you ever read one? (Microsoft has an excellent one that is easy to read, while Wal-Mart's is easy enough to read but lacking in navigation friendliness.) Maybe when you do online banking you take the time to read it, but I'd bet a dollar you never read it when you sign up for another widget you want to use on Facebook (which the FB folks wash their hands of when it is not their widget! What! You didn't know their privacy policies were NOT transitive?). So why not have a privacy policy management metric? Why not include the privacy policy as part of the Privacy Knowledge (PK) that the enterprise has to manage, since the privacy policy defines the set of requirements the company must adhere to when they DO something with the not-as-private information of yours that they now have.

What if we had a full-blown PK though? To some folks this seems like slicing the onion a different way, but bear with me for a minute. What if you actually took all the SK steps, applied them to privacy knowledge, and treated privacy knowledge as being as important as security knowledge? The steps would be:

  • Get a full inventory of all your information assets (instead of the infrastructure itself) that apply to your customers, employees, and intellectual property.
  • Understand who has access to the information and who can transform, store, or transmit it ("who" being people, processes, or technology!)
  • Extract privacy knowledge (privacy requirements) from the privacy policy
  • Associate the privacy requirements with privacy controls. Control instruments would include rule of law, regulations (HIPAA, SOX, etc.), and internal business rules as defined in the privacy policy.

Hypothetically speaking, if you took these steps, put the resulting information into a database, married it to the Privacy Rights Clearinghouse database, and threw in some information asset values, you could begin to perform analytics against the policies. With the historical breach data you could model potential exposures and perform what-ifs. Minimally, you could begin to evaluate privacy risk quantitatively. If nothing else, you would be a lot closer to understanding X, X standing for the unknown.
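To make the idea concrete, here is a minimal sketch of what such a PK store and one analytic might look like. Everything here is hypothetical: the table names, columns, and sample rows are illustrative, not a schema from the paper. The query simply surfaces policy requirements that have no control mapped to them, one of the simplest "privacy knowledge" analytics you could run.

```python
import sqlite3

# Hypothetical minimal "privacy knowledge" (PK) schema: information assets,
# who can access them, the requirements extracted from the privacy policy,
# and the control instruments mapped to each requirement.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE asset       (id INTEGER PRIMARY KEY, name TEXT, owner TEXT, value REAL);
CREATE TABLE access      (asset_id INTEGER, actor TEXT, action TEXT);  -- transform/store/transmit
CREATE TABLE requirement (id INTEGER PRIMARY KEY, source TEXT, text TEXT);
CREATE TABLE control     (req_id INTEGER, instrument TEXT);            -- e.g. HIPAA, SOX, policy rule
""")

conn.execute("INSERT INTO asset VALUES (1, 'customer PII', 'marketing', 250000.0)")
conn.execute("INSERT INTO access VALUES (1, 'crm-app', 'store')")
conn.execute("INSERT INTO requirement VALUES (1, 'privacy policy', 'PII is never shared with third parties')")
# Requirement 1 has no control row yet -- that gap is what we want to surface.

# Analytic: which stated privacy requirements have no associated control?
gaps = conn.execute("""
SELECT r.id, r.text FROM requirement r
LEFT JOIN control c ON c.req_id = r.id
WHERE c.req_id IS NULL
""").fetchall()
print(gaps)  # [(1, 'PII is never shared with third parties')]
```

Join in breach-history and asset-value data and the same shape of query starts to support the quantitative what-ifs described above.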


Posted in Uncategorized | Leave a Comment »

Twas the Night before Cloudness

Posted by Wayne on December 25, 2009

Twas the night before Cloudness, when all through the Cloud
Not a creature was stirring, not even a squirrel.
The tweets were sent only 140 with care,
In hopes that a final, final, final cloud definition soon would be there.

The Clouderati were nestled all snug in their beds,
While visions of standards danced in their heads.
Dot.Gov and her cyberchief, and OVF in his cap,
DMTF and OCCI messed with our brains before a long winter’s nap.

When out on the Vaporware there arose such a clatter,
I sprang from the Mac to see what was the matter.
Typed away on the keyboard, I tweeted like a flash,
Private & Public, for Hybrid – just add a dash.

The predictions had come, for 2010 like new-fallen snow
Predicted the cloud #fail, all of the vendors we know.
When, what to my cloud-washed eyes should appear,
But a transparent cloud provider, and eight controls simple & clear.

With a bunch of new VM’s, and security we designed to stick,
I knew in a moment SAS70, I wouldn’t pick.
More rapid than Cloud Security Alliance, then ENISA came,
And they covered their _aaS  and called out the controls by name!

“Now Bochagalupe! Now, Suredy! Now lmacvittie, Werner and Alverez (cloud Vixen)!
On, Ruv! On, WattersJames! On GeorgeReese, on Jamesurquhart and Randy-Biaz!
On Samj! On Aneel! On Mfratto! On ShlomoSwidler and Swardly!

To the top of the storage farm! to the edge of the firewall!
Now compute away! Compute away! Compute away all!”

As the competition heated up, like the wild hurricane fly,
When they meet with an obstacle, change their offers on the sly.
And finally the enterprise, to the providers they flew,
With the sleigh full of AWS, Google, and now Microsoft too.

And then, in a twinkling, I heard where’s the proof
Will my data be safe, VM’s can the hackers spoof?
As I drew in my head, an architecture that was sound,
I conferred with my cloudmates, not an exposure was found.

Along came one all dressed in fur, with a tail longer than a foot
Yet his nickname was beaker and he often says !woot.
He carried a bundle of Cloud Toys he had flung on his back,
And he told us about frogs, while opening his pack.

Others worried about interop and if there was privacy!
Some call the process a mobocracy and others an isocracy!
Some speak of the cloud and security as ones in the know,
One has his sock puppets and beard as white as snow.

They spoke not a word, but tweet’d while they work,
And filled all the demand, keeps them from going totally berserk.
And laying out standards, all gaps they expose.
Ready their clouds, for enterprise rose!

They sprang to their work, this A-team gave a whistle,
And away they all flew, like the shot of a missile.
And I heard the Clouderati exclaim, ‘ere they drove out of sight,
“Happy Cloudness to all, and to all a good-night!”

Posted in Uncategorized | Leave a Comment »

Veiled Transparency

Posted by Wayne on November 23, 2009

Over the past month I've been researching (mostly searching, and the results have been #fail) cloud providers to understand what they use to "assure" trust. In other words: if I'm a company of sufficient size that risk outweighs convenience, and I want to use the cloud, I want to make sure my site will be secure, my information will be protected with the privacy controls I require for my business (be they HIPAA/HITECH, SOX, PCI, etc.), and the site will maintain good availability via service levels.

Many of the providers want you to believe that they are in fact transparent, which also happens to be the latest buzzword in the blogosphere (there are some great articles by Hoff and Randy Bias, to name a few), regarding the information that the providers are willing to put up on their websites. What I've found is that just getting this information as a non-customer is not an easy feat. Some providers like Google and Microsoft provide what I would call one-stop shopping (e.g. Google Privacy Center, Microsoft Online Privacy). They have a web page that gives you the core stuff like their terms of service, privacy policy, and either a security policy or at least a white paper on their controls. Microsoft has developed a full online Compliance Framework that they "appear" to be applying to Azure.

First let's tackle what constitutes transparency for the customer (a customer being someone who is going to place a business ON the cloud provider's systems). As a potential customer I'm likely to want to see the following:

  • SLAs – What are the SLAs? Are they five 9's? Four 9's? What are the aggregated service levels if I'm a typical customer? If I use your messaging, storage, and compute services, do I still get an aggregate of five 9's, or was one of the services at a lower level? Is there a service level credit? For example, GoGrid provides a 10,000x credit for downtime while others just credit your minutes of lost time. Several providers will also actually pay the customer for business lost, though this is provided via an insurance policy that the business could otherwise acquire.
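The aggregation question matters more than it looks. A rough rule of thumb (assuming the services fail independently and your workload needs all of them, which is the worst case for serial composition) is that aggregate availability is the product of the individual availabilities, so three five-9's services chained together are no longer five 9's:

```python
# Rough model: a workload that depends on several services in series is only
# as available as the product of their availabilities (assuming independent
# failures). Three "five 9s" services together land closer to four-and-a-half 9s.
def aggregate_availability(*availabilities):
    product = 1.0
    for a in availabilities:
        product *= a
    return product

five_nines = 0.99999
agg = aggregate_availability(five_nines, five_nines, five_nines)  # messaging + storage + compute
print(f"{agg:.6f}")                 # 0.999970
print((1 - agg) * 365 * 24 * 60)    # expected downtime, minutes/year (~15.8)
```

That is the number to ask the provider about: the SLA of the composed service-stream, not of each piece in isolation.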
  • External Audits – The rage seems to be SAS70 Type II audits, which are designed to provide an element of financial-systems trust by having a third-party certified auditor look at security policies and procedures, determine what controls are in place, and measure their effectiveness as they relate to an audit of the financial statements (see SAS70 details). This is great if you are trying to make sure the provider has controls in place that support SOX requirements, but what about general PII/PCI controls or HIPAA? What about their security posture? I'd argue that this type of audit is a good "business control" type of audit. What is still needed is a security audit and a privacy audit to complete the picture. Minimally, as a customer I would want to see the SAS70 audit results and the mitigation plans. While many of the providers have this audit done, none of them currently publish the results on their websites, and they would not provide the details to me via email. One provider did provide a list of the SAS70 control "buckets" they include in their audit, but they would not let me publish them via a blog or post to the web; to quote them, it would "lessen its value" (not sure how that is, but they provided it for my research). One last note: I found that ADP would provide the results of their SAS70 audits to their paying customers for the areas that aligned to the services they used.
  • Internal Audits/Assessments – It is assumed that security and privacy policies need to be in place as SOP for data center operations. Along with the policies come internal audits/assessments that are performed with due rigor and regularity. The international standard ISO/IEC 27001:2005 covers security assessments and is considered by many to be one of the best and most thorough. There are several other well-known security assessments, such as OCTAVE from Carnegie Mellon, which uses a Bayesian model based on quantitative analysis of qualitative data and is designed to be used by internal resources. The US Government has also developed a series of standards in NIST SP800-53. The results of these audits are not generally available to "prospects" or even actual customers, are not published, nor are they necessarily "honest". For example, I ran across this statement on risk assessments from Pivot Point Security:

At this point Risk Assessments are a lot like a bikini; “What they reveal is suggestive, but what they conceal is vital”. Worse, it’s easy (and common) to make what they reveal what you want them to reveal.

Having performed and participated in OCTAVE, ISO/IEC 27001, NIST SP800-53, and COBIT audits myself, I found out a few things in the process. Purely internal or purely external assessments introduce too much bias. OCTAVE was designed to be run internally because the subject matter expertise lies within the organization, and employees/security staff have a better understanding of "asset value" (and I'm not going to get into the whole debate on the usefulness of ALE/ROSI valuation methods). The internal bias could potentially be mitigated by having an external firm provide oversight and guidance. Perhaps the best (and most expensive) method would be to have both an internal and an external audit performed and compare them for patterns and gaps. Having run operations for a managed service provider in the past, my experience was that we would run internal assessments on off cycles from the external ones; the external audits would go through a "rough pass" phase, we would fix the most egregious problems, a final pass would be run, and then the results would be provided to requesting and paying customers. If they weren't both requesting and paying, they didn't see our security audit results.

One last comment on risk assessments: there is a new method named FAIR, which Hoff pointed out, developed by Jack Jones of Risk Management Insights, that takes a different (and refreshing) approach to assessment methods. While most assessment methods rely heavily on interviewing and subjective qualitative data, FAIR uses quantitative analysis for asset valuation and threat impact, and also uses Monte Carlo simulations to pinpoint where the threats are most probable. This is distinctive because it is far more quantitative, which makes it potentially far more machine readable/executable (I'll expand on why this is important in future blogs).
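To illustrate the Monte Carlo idea (this is a toy, not FAIR itself; the distributions, rates, and loss figures are all assumptions I picked for the sketch): draw a number of loss events per year and a magnitude per event from assumed distributions, repeat many times, and read risk off the resulting annual-loss distribution instead of off an interview form.

```python
import math
import random

random.seed(7)  # fixed seed so the sketch is reproducible

def poisson(lam):
    # Knuth's algorithm: multiply uniforms until the product drops below e^-lam
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def simulate_annual_loss(trials=10_000, event_rate=2.0, loss_median=50_000.0, loss_sigma=1.0):
    """Assumed model: Poisson count of loss events/year, lognormal loss per event."""
    annual = []
    for _ in range(trials):
        n = poisson(event_rate)  # loss events this year
        year_loss = sum(random.lognormvariate(math.log(loss_median), loss_sigma)
                        for _ in range(n))
        annual.append(year_loss)
    annual.sort()
    return annual

annual = simulate_annual_loss()
median = annual[len(annual) // 2]
p95 = annual[int(len(annual) * 0.95)]
print(f"median annual loss ~ {median:,.0f}; 95th percentile ~ {p95:,.0f}")
```

The output is a distribution you can query ("what loss do we exceed one year in twenty?"), which is exactly the machine-readable quality that interview-driven assessments lack.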

  • Employee Certifications/Expertise – If you go back a bit in time to ASP/MSPs vs. hosting, there was a line of demarcation that appeared when you wanted help. The MSP had excellent subject matter expertise on the services they provided, all the way up the stack to whatever level they offered. If they were a database MSP, they had experts in database, security, backup, etc. at your disposal. If they were a hosting provider, they stopped at the lowest level: they knew a lot about power/pipe/ping (physical security, power, cooling, core network), but if you needed OS support, they might or might not provide it, and if they did, it was not included in the service.

When looking at cloud providers you should look at the experience and certification levels of the staff as part of your investigation. I would also suggest looking at "where" the talent is and what hours they work. For example, if they use a "follow-the-sun" method, that may mean the staff you are using during your normal workday does a hand-off when the clock strikes 5:00 PM, and you have to re-educate someone new who may want to take too much creative license with the focus of the troubleshooting effort. No matter what, find out if the people working in support have names followed by the alphabet soup we are all accustomed to in the IT industry (CISSP, CIPP, CNE, CNA, MCSE, RSA/CA, etc.), and make sure they have these certs from reputable organizations such as Cisco, ISC2, etc.

Also consider using a provider that is ITIL certified, or at least has ITIL certified staff members. Why? For one, ITIL was designed to improve the quality of service management by creating a framework of best practices for organizations to establish a service desk and a services catalog, and to measure service levels against. The latest version, ITIL v3, includes the use of third-party providers, extending the standard into the cloud/MSP/ASP world.

  • Miscellaneous – The final set of things to look at: have these guys been in business a while? Are they solvent? What outage/security events have they had? Are they willing to provide you with the things listed above (and anything else you need) to make a good decision? Also make sure you really understand their billing model: some providers charge you for the "max used" or "burst" rate for the month, while others do some averaging. Some include or group services together (such as DNS being part of network usage) while others are 100% a-la-carte, and you need to pay for each separately. Perhaps someday we'll see finer-grained metering systems (due to competition), like networks that use 95th percentile billing to allow for short bursts.
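The difference between burst billing and 95th percentile billing is easy to show with made-up numbers (the sample values below are illustrative, not any provider's actual meter data). With 95th percentile billing the top 5% of usage samples are discarded before the billable rate is picked, so short bursts don't set your bill:

```python
# Toy comparison of billing models on the same usage samples (e.g. Mbps,
# one sample per 5-minute interval over a billing period).
def billable_rate_p95(samples):
    ordered = sorted(samples)
    # discard the top 5% of samples, bill at the highest remaining one
    return ordered[int(len(ordered) * 0.95) - 1]

samples = [10] * 95 + [100] * 5        # steady 10 Mbps with a few 100 Mbps bursts
print(max(samples))                    # burst/"max used" billing: 100
print(billable_rate_p95(samples))      # 95th percentile billing: 10
```

Same traffic, a 10x difference in the billable rate; that is why the metering model deserves as much scrutiny as the headline price.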

One final thought on this tome I've written (assuming you read this far!): consider what happens when your cloud provider is part of a set of service providers. For example, if you are using one provider who gives you an easy portal to set up and manage your cloud infrastructure, another behind that who provides the core services (storage, compute), another for backup/DR, and another for security, start to think about the complexity (= risk). Does this aggregated service (what I like to call the service-stream) still have the security level, SLA, etc. that you had when you started? Do they have the same privacy standards and requirements? Are the protections transitive? Are they willing to test an outage and share the results, or actually include you in the process (like you do internally when you run your DR test plan)?

In the end you need to decide: is the provider making it easy for you to understand how they do business with you? Are they open to sharing their controls, methods, etc.? Or do you have to work really hard to find out what they are really doing on your behalf? Don't take the thinly veiled answer that it is for your protection that they won't provide the information. If you are just using a free service, you get what you pay for. But if you are a real paying customer, you don't deserve to be treated with obscurity or directed to talk to someone else. The cloud is supposed to be self-service and automated; it is up to the providers to include in that service making it easy for potential and paying customers to get the answers they need to make their stockholders and customers happy.

Attached are the results of my look at various providers via search and the web. It is incomplete: some sites had everything in one place, making it easy to find, while the entries with empty spaces are there because after trying for hours I gave up. It could be that my search skills are not what they should be, but I think if a 25+ year IT vet can't find stuff easily on the web or with search, then you are losing customers already.




Posted in cloud, privacy, risk, security, Uncategorized | Leave a Comment »

When Privacy Hurts

Posted by Wayne on October 30, 2009

I recently had the opportunity to participate in a Cloud Healthcare Summit sponsored by Microsoft and Capstone Partners in Boston. I generally find the networking to be the best part of these events because I get to meet new people and catch up with old friends. If I also get to come away with ONE good thought or idea, then I really consider myself ahead of the game. At this particular summit I learned several things that I think are going to be key to understanding privacy as it relates to healthcare.

First and foremost, the folks on the panel at this event were top-notch, with amazing careers, education, and passion. Four of the five panelists were software/solutions people who have spent decades working in the healthcare (HC) industry trying to solve HC problems. The fifth was the CIO of a large HC provider/hospital network (awesome to have an HC cloud customer on the panel).

The thing that resonated with me most was that they all had problems with information silos (customer and providers alike): systems that don't work together, that aren't designed to share information, or that don't even provide mechanisms to find the information that is in them. Combined with these silos is what seems to be an age-old problem that larger enterprises solved some time ago: hardcopy (or "trapped") information that has to be manually sorted, filed, updated, and searched. Apparently HC still relies heavily on paper copies (yes nurse, I have filled out that form already 10 times, I swear). Add to this that they are also constrained by privacy issues.

WHAT? Did I just say "constrained by privacy issues"? Yes I did, and as a personal advocate for improving our privacy stance through education, technology, and regulation, this may come as a surprise to some. One of the entrepreneurs on the panel, who also happened to be an MD, took some time to give me a different view by explaining the artificial ceiling that regulation and policy have created with regard to patient data. From his perspective these have a three-fold impact on the HC business:

  1. Innovation is stifled – being unable to take samples of data and perform analytics on them, or to share information across systems.
  2. Research is slowed down – one example: being unable to take samples large enough to provide statistically meaningful results has forced researchers to hoard the data they do get, because it is so hard to acquire and is tied to their ability to get future funding.
  3. Medical errors are higher – consider being in an emergency situation where your records first have to be released, then sent by courier, fax, or USPS. All of these affect timely and responsive outcomes that impact a patient's safety.

What this all boils down to is that privacy has to live by the same rule that security does: controls must be strong enough to protect while still enabling efficient and effective use. The cloud, big data, and XBRL are all technologies that will enable HC improvements, but only if HC regulations and policies change with the innovations. Regulatory efforts such as HITECH are heading in the right direction, but as always we must find the right balance between protecting our liberties and safety as patients and improving medical science by leveraging information technology in ways never possible before.

As always – looking forward to your comments!



Posted in Uncategorized | Leave a Comment »

Multi-tenancy: It’s not just for databases anymore

Posted by Wayne on October 16, 2009

This week I had the privilege to participate as a moderator with some of EMC's smartest innovators at the 3rd Annual EMC Innovation Conference. I got to ask them what their views were on multi-tenancy as it applied to the basis of their work: storage, server virtualization, and databases.

For the cloud – multi-tenancy is going to mean all of the above plus networks.

One of the analogies on multi-tenancy I've heard floating around work for a while has to do with the notion of a motel, where the rooms are temporarily occupied by a tenant. When the tenant doesn't need the room anymore, the room is cleaned out and can then be used by someone else. The motel analogy is certainly a fair metaphor for server virtualization, but it seems to fall down with database multi-tenancy.

Database multi-tenancy tends to be a little more complex, not because it is hard to create a good "motel room" for the tenants, but because it is much harder to provide the tenants all the amenities they tend to want. By amenities we're talking about the trade-offs in granularity, scale, performance, customization, and security/privacy. These trade-offs are tough enough to manage when they are inside your own data center, but they are much harder out in the cloud world with external, unknown tenants who bring a new set of risks for both the provider and the tenant.
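A tiny sketch of why the security/privacy amenity is the hard one. One common database multi-tenancy pattern (one of several; the schema and tenant names here are invented for illustration) is a single shared schema with a `tenant_id` column, where isolation exists only because every query is scoped to the tenant:

```python
import sqlite3

# Shared-schema multi-tenancy: all tenants live in one table, and isolation
# is enforced by always filtering on tenant_id. Cheap to scale, but one
# missed filter leaks another tenant's data -- the "amenity" trade-off.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (tenant_id TEXT, item TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", "widget"), ("acme", "gadget"), ("globex", "sprocket")])

def orders_for(tenant):
    # every query MUST carry the tenant scope; there is no other wall
    return conn.execute(
        "SELECT item FROM orders WHERE tenant_id = ?", (tenant,)).fetchall()

print(orders_for("acme"))    # [('widget',), ('gadget',)]
print(orders_for("globex"))  # [('sprocket',)]
```

Contrast this with the motel model: the wall between virtual machines is enforced by the hypervisor, while here the wall is a `WHERE` clause that every piece of application code has to get right.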

Network multi-tenancy has been around for years in the form of VLANs, NATs, and VPNs, which provide segmentation and protection of connected tenants, i.e. resource isolation.

Now – what happens when you mash these layers  together and make them all multi-tenant? Isn’t this an attribute of the cloud?

Yes, this is a key attribute and a requirement for much of the cloud (private or public). The design goals of multi-tenancy are pretty straightforward and should resonate with folks who have already embarked on the server virtualization journey:

  • Leverage technology – create shared hardware and software resources for multiple users or uses (apps) while maintaining isolation. For the cloud this applies to all layers (network, server, etc.).
  • Improve cost structure – improve repeatability by reducing customizations and sharing resources while maintaining resource isolation.

The challenges of building and supporting this type of architecture with regard to the cloud expose a couple of divergent requirements:

  • Custom vs. Utility – how do you provide a "utility"-based economic model and support strong resource isolation for the tenants, while still allowing "custom" and temporal workloads?
  • Siloed Tenancy – multi-tenancy is currently siloed to a single layer of the infrastructure (e.g. database multi-tenancy is unrelated to network multi-tenancy).

In particular I'm interested in the second one (silos), because I think the first one will work itself out as the market makes trade-offs and incremental improvements. The silo issue is a bigger one, one that I think public cloud providers Google App Engine and Amazon Web Services address today (by obfuscation). With the delivery of new capabilities by the hypervisor vendors, such as VMware's vCloud API and vApps, which allow applications to run seamlessly across private or public cloud infrastructures, we should begin to see other layers take advantage of these kinds of APIs.

One example of why this is a good path to go down: the role/identity problem that exists today within one layer is tough enough. Add in multi-tenancy, and role management and resource access (protection) become critical to making sure a database with a shared schema protects the tenants' data. Take that notion and expand it to the other layers. Now the identity needs to persist across the network, OS, and database while also allowing the identity to have multiple roles, such as a group or department leader who may manage access rights to a subset of the department's data by others. The department leader may also be a plain user of their own group's information, a user of several other groups' information, etc. This quickly becomes complex and unmanageable ("just give me admin access!"), so an easy-to-use and flexible identity management capability is just ONE of the major challenges of a true multi-tenant cloud.
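The multi-role part of that problem can be sketched in a few lines. This is a hypothetical toy (the role names, grant sets, and group names are all invented), showing the same identity holding different roles in different groups, with every access check carrying the (group, action) scope:

```python
from collections import defaultdict

# Illustrative role -> permitted-actions mapping
GRANTS = {"leader": {"read", "write", "grant"}, "member": {"read"}}

class Identity:
    """One person, many group-scoped roles."""
    def __init__(self, name):
        self.name = name
        self.roles = defaultdict(set)  # group -> set of role names

    def grant(self, group, role):
        self.roles[group].add(role)

    def can(self, group, action):
        # an action is allowed if ANY role held in that group permits it;
        # unknown groups yield an empty role set, so the default is deny
        return any(action in GRANTS[r] for r in self.roles[group])

alice = Identity("alice")
alice.grant("dept-a", "leader")   # manages dept-a's data
alice.grant("dept-b", "member")   # plain user of dept-b's data

print(alice.can("dept-a", "write"))  # True
print(alice.can("dept-b", "write"))  # False
print(alice.can("dept-c", "read"))   # False -- no role there at all
```

The pain described above is that this check has to give the same answer at the network, OS, and database layers; today each layer keeps its own copy of `roles`, and keeping them consistent is the unsolved part.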

What do you think?



Some URLs and papers worth reading on multi-tenancy:

 Virtualization-based Techniques for Enabling Multi-tenant Management Tools

Architecture Strategies for Catching the Long Tail

Multi-Tenant Data Architecture

 Agrawal, R., Ailamaki, A., Bernstein, P. A., Brewer, E. A., Carey, M. J., Chaudhuri, S., et al. (2009). The Claremont report on database research. Communications of the ACM, 52 (6), 56-65.


Aulbach, S., Grust, T., Jacobs, D., Kemper, A., & Rittinger, J. (2008). Multi-tenant databases for software as a service: schema-mapping techniques. Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data, 1195-1206.


Aulbach, S., Jacobs, D., Kemper, A., & Seibold, M. (2009). A comparison of flexible schemas for software as a service. Proceedings of the 35th SIGMOD International Conference on Management of Data, 881-888.


Candan, K. S., Li, W., Phan, T., & Zhou, M. (2009). Frontiers in information and software as services. 2009 IEEE International Conference on Data Engineering, 1761-1768.


Kaliski, B. (2008). Multi-tenant cloud computing: from cruise liners to container ships. In W. Mao (Ed.), Third Asia-Pacific Trusted Infrastructure Technologies Conference (APTC 2008). IEEE Press, 4.


Posted in Uncategorized | Leave a Comment »

A Question of Transparency …

Posted by Wayne on October 8, 2009

Yesterday I was on my monthly IEEE-USA CCP conference call to discuss the fourth quarter’s plans, papers underway, and plans for 2010. The topics all center on communications policy, which I always find fascinating. I volunteered about a year ago to be on this committee because I felt at the time that privacy/security and communications go hand-in-hand. I also wanted to learn how a group of scientists worked together to try to influence Congress, the FCC, the FTC, etc.

I’ve been honored to be involved with this group since then and I’ve learned so much. When I listen to them working I think things like:

  • Scientists don’t always agree! Who would have thought!
  • Agendas exist in all facets of life and work
  • Patience is a virtue
  • Wow – these are some wicked smart people
  • Some of them have clearly embraced social networking, some seem to fear it a bit.
  • All of them really care about science and this country

The last point is always what I leave the meetings thinking about. Without fail, several of the people I work with on this team really try to foster teamwork while also moving things forward. Without that, this is hard work – and since it is volunteer work, you have to feel like you are working with a group of people on something important and that your efforts are appreciated, or else why do it? This team does that for me every time, and I always enjoy the time talking, listening, and learning from them.

The topics for this meeting ranged from “new technologies for broadband access” to “FCC spectrum reform” to “VoIP” to “privacy and security” (my favorite, of course).

The security and privacy topic has been hotly debated by all – which tells me it is an important issue to the team. The trouble with this particular topic is that it is a “boil the ocean” topic area, especially when you add in the communications spin. So how do we resolve it? Well, once again the insights of people on the committee who have been at this a lot longer (and are smarter than me) came through: they took my executive summary for this paper and suggested two major changes:

– Make it about privacy and security “transparency” instead of safety

– Take an example from other work by picking and prioritizing the Top 10 issues to write about (instead of boiling the ocean).

Bam – that was it. I had struggled to get my arms around it for so long because it is such a broad topic and while safety is a huge issue – from a policy perspective it is hard to generate a lot of interest at this time in our history.

Transparency – If you think about the problems with technology and privacy, they often boil down to an issue of transparency. Is information being gathered that you are not aware of? Will it be used in a way unknown to you? Will it be stored somewhere you didn’t intend it to be?

Apply this notion of transparency to national policy, and perhaps we can ask some questions of our communications systems:

  • Are we clear on what the regulations are and what the ramifications are if they are not followed?
  • Do we have a good education eco-system that enables children, consumers, citizens to understand how their privacy is being affected by the entity they are engaged with (or follow-on entities)?
  • Does our system allow for innovation by fostering collaboration between our government, academia, and commercial entities?

These are important tenets that need to permeate our national policy and behaviors as we build and manage communications systems in this country. So in my next few blogs I’ll take a shot at identifying some of the areas that need changes to enable transparency with regard to privacy and security. Then I’ll try to distill it down to my list of the Top 10 issues – we’ll see if I hit the target and it prompts some of you to comment.


Posted in Uncategorized | Leave a Comment »

Why Privately Exposed?

Posted by Wayne on October 3, 2009

Seems to me like privacy issues come up every day in the news. Whether it is good news, bad news, or just new regulations and laws about privacy, it is becoming harder to hide or be off the grid. I just googled the word privacy and got over 1 billion hits! The UK has installed over 10,000 cameras across 32 boroughs (as of 2007), and Washington, DC has a penchant for the same type of privacy invasion, with CCTV cameras appearing on every corner, at all public transport sites, all government buildings, etc.

Add to this the fact that our lives are becoming a mere shadow of our digital existence, and it quickly becomes an area I think we should be paying close attention to. Security provides the instruments needed to protect our privacy – and privacy means the information about us that we choose to share is shared on our terms: we understand where that information will be seen and used, how it will be combined with other information, where it will be stored, how long it will be kept around, and we have choice/consent over everything that happens with OUR information.

Add to that the “cloud computing” phenomenon, and now you have some really smart people scratching their heads about privacy in this new computing paradigm. What happens when information that used to sit inside the Chinese wall of the enterprise is now sitting in a 3rd-party provider’s data center? Does the Patriot Act come into play in a different way than it did when the data was inside the enterprise? What about the use of 3rd parties that use 3rd parties? Do the protections flow with an “inheritance clause,” or is each sub-level of agreement treated with a new service level and privacy protection level?

Like I said – the good news is some really smart people are spending a lot of time discussing and working toward solving these issues, from government to academia to the enterprise. My hope is to help expose some of the good and bad of what is going on in the privacy domain (especially as it relates to cloud and the enterprise), put my opinion out there, show what I find in the research, and together we can come away with a new consensus on how to proceed.

Also – be warned – I’m here to learn and use what I learn in my research for my doctorate and beyond so I may want to contact you directly if you comment to find out more from you!


Posted in Uncategorized | Leave a Comment »
