Posts tagged ‘computer’

every house needs a san

I know the fear you hold inside
I know what’s weighing on your mind
There’s nothing you can do
So stand up, get up, back up.

Ever notice how much digital stuff you accumulate? Notice how you have copies of copies floating around in different directories, on different machines? How much of it is important?

I’m not going to preach about backups… we all know what we need to do.

I can honestly say that I do it, and it has been a part of my life for many years. It saved me about two years ago when my laptop was stolen. I simply bought a replacement and was fully operational within hours. It is just like insurance: you hate paying the price, but when the promise is made good, it all feels worthwhile.

What would you do if you lost a movie? These days most people would kiss their DVD goodbye and use the misfortune as an excuse to upgrade to a Blu-ray version. What if you lost an entire digital library? Maybe a little hard to imagine just yet; after all, how many people have digital movie libraries on their home network?

But what if the same thing applied to all of those great digital photos? Your MP3 library?

Is this why SANs and mirrored drives have started to become a consumer category? Not many at your local big-box electronics store just yet, but there is certainly a whole range on-line for the serious hobbyist consumer.

Will it ever hit the mainstream? Or will the general digital-consuming populace opt for consuming 'long-form content' on-line? Pipes into the home are now sufficiently affordable to support streaming, and those Blu-ray players, Xboxes, PS3s, Wiis, TVs, PCs etc. support some type of commercial streaming service (like Netflix, Blockbuster etc.).

I have seen my children lose files. Having grown up with computers, they accept it as a normal thing and immediately source another copy from the web, even though they could get the backup quite easily. This leads me to the conclusion that the consumer of tomorrow is more likely to see the 'cloud in the web' as their information/data/content repository and not really be bothered with a SAN in the basement. I, on the other hand, will continue to indulge myself and keep a copy… just in case.

Tell me it isn’t so… I’m listening.

April 1, 2010 at 7:14 am

digital asset management

Talent is an asset
You’ve got to understand that
Talent is an asset
And little Albert has it
Talent is an asset
And Albert surely has it

In my wanderings across the media landscape I have encountered "Digital Asset Management", "Media Asset Management", "Content Management", "Asset Management" and more, terms each of you has no doubt faced as well. But why so many terms for the inflection from tape to digital?

I believe the confusion arises out of workflow, product functionality and the need for vendor differentiation. Obfuscation is a remarkable sales tool, and perhaps this taxonomic confusion exemplifies the case of “those who cannot do – sell.”

On a much kinder note, perhaps the confusion is a natural response to a nascent technology, one born out of the digital abstraction of a physical entity. You see, when you can touch something, your perception as a human is that it has some innate value. When you cannot touch it, the value is diminished. Unless we actually have equity in the content, we lose the connection between perceived financial value and the invisible bits which ultimately express that value. Instead, we ascribe value to the infrastructure which enables those bits to be expressed, i.e. the network, PC, iPod, plasma TV, STB etc. This is my theory for why even seemingly upstanding citizens engage in dubious 'pirate' activities. I encountered the same mindset in China in the mid 90's, when the value of a CD lay in the medium, i.e. the plastic, more so than in what was on it. Similarly, when couriering software and data mag-tapes across continents, customs officials always wanted to know the value of the tapes, never really understanding the value of their contents.

For the record, I gravitate to the term digital asset management. The key lies in understanding Assets themselves. First, a little definition.

Assets have three essential characteristics:

  • The probable present benefit involves a capacity, singly or in combination with other assets, in the case of profit oriented enterprises, to contribute directly or indirectly to future net cash flows, and, in the case of not-for-profit organizations, to provide services;
  • The entity can control access to the benefit;
  • The transaction or event giving rise to the entity’s right to, or control of, the benefit has already occurred.

In normal speak, this is an ability to

  • make money or provide services
  • maintain control or access
  • leverage transactions
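The three characteristics above map almost directly onto a small data model. A minimal sketch (the class and field names are my own invention, not any product's API):

```python
from dataclasses import dataclass, field

@dataclass
class DigitalAsset:
    """An asset per the three characteristics: benefit, control, provenance."""
    name: str
    future_benefit: float   # expected contribution to cash flows or services
    controlled_by: str      # entity that controls access to the benefit
    acquired: bool = True   # the transaction giving rise to the right has occurred
    rights: list = field(default_factory=list)  # rights wrapped around the value

    def is_asset(self) -> bool:
        # All three characteristics must hold for the resource to qualify.
        return self.future_benefit > 0 and bool(self.controlled_by) and self.acquired
```

Note how naturally a financial field sits alongside the content metadata, which is exactly what most repositories leave out.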

This seems intuitively complete as a description of what we need to be able to do with digital content, regardless of type. Yet how many digital asset management systems have financial interfaces? Furthermore, how many effectively endow their content with the aforementioned characteristics? How many are just plain digital 'libraries' or 'content management' repositories?

In the financial accounting sense of the term, it is not necessary to be able to legally enforce the asset's benefit in order to qualify a resource as an asset, provided the entity can control its use by other means.

This brings to mind an interesting encounter with a recent customer. As a very large content creator, they were not much interested in protecting their content with DRM, since their main customers were Service Providers like broadcasters, cable companies and telcos. They distributed their content via the web, in most cases with FTP, and fully expected their business partners to abide by contractual usage provisions. In fact, if the content was 'over-utilized', that was, in a sense, fine by my customer: more exposure. Their prime concern, however, was 'editing' of the content, i.e. changing the creative content, the storyline, or the brand. As long as the content was edited in compliance with regulations, that was OK; if it was edited to allow more commercial content, that was not.

By extension, media asset management expands my definition of digital asset management to include the tracking of physical copies of the content, i.e. tapes, CDs, manuscripts etc. Hence the word media, implying all manifestations of content instantiation.

So, when you use the word 'asset', it is important to understand that the digital content has a 'value', and that this value needs to be wrapped with rights to ensure that the maximum equity is extracted for the minimum liability, financial or otherwise. Failure to do so implies, to my mind, that you are only addressing one part of the business: the technical operations.

Without an appreciation of the valuation of their digital inventory, how is a business ever going to evolve to a new digital business model?

Tell me it isn’t so… I’m listening.

February 18, 2010 at 1:00 am

making toast

If I could be like that
I would give anything
Just to live one day
In those shoes
If I could be like that
What would I do?
What would I do?

From the time programmers first tried to interact with the CPU, we have seen several paradigms for human-computer interaction: from switches, through paper tape, to cards, to CRTs, to field verification, to form-based transactions, to GUIs and even styluses (or should that be styli?). Through this evolution, two fundamental approaches to human-machine interaction have emerged. I'm sure the Interface Engineers out there will offer the appropriate terminology, but the two approaches can be characterized as 'application-centric' and 'object-centric.'

Application-centric really means 'loading the application to access or create the data from within the application environment', whereas object-centric is more in line with the GUI approach of 'activating the document/object and having the operating system instantiate the application and the environment to use the data.' The latter is a very content-centric approach, one that I subscribed to until quite recently. Let me explain my change of thinking…
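The difference can be caricatured in a few lines of code: in the object-centric model the system keeps a registry mapping content types to applications, so activating a document implicitly picks the tool. A toy sketch (the handler names and the table are purely illustrative):

```python
# Object-centric: the system maps the object's type to an application,
# so the user acts on the document rather than launching a program first.
HANDLERS = {
    ".doc": "word_processor",
    ".mp3": "music_player",
    ".jpg": "image_viewer",
}

def activate(document: str) -> str:
    """Given a document name, return the application the system would launch."""
    ext = document[document.rfind("."):].lower()
    app = HANDLERS.get(ext)
    if app is None:
        raise ValueError(f"no handler registered for {ext!r}")
    return app
```

In the application-centric model the user would start `word_processor` first and open the document from inside it; the registry is unnecessary because the choice of tool is explicit.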

It all comes back to toast.

In the days before toasters, people put bread near a source of heat to make toast. It didn’t matter whether it was a wood fire, electric elements, gas burner etc. Take bread, add high heat, bread gets toasted. Toasting was a technique. A verb that subsequently became a noun, the inverse of nouns becoming verbs like Fax, Fedex, Google.

One day, as energy was channeled via a mass market utility called electricity, the ‘toaster’ was invented. It was a good appliance. It got better over time with more features to prevent burning, to defrost frozen bread, to understand different thicknesses etc.

But for all of its newly inherited capabilities, the toaster still just transformed bread into toast. And the consumer liked the toaster.

People like appliances. They are easy to use because their role is well defined. Just think of the plethora of successful inventions – invariably they have resulted in appliances or devices that are ostensibly single purpose. Cars, refrigerators, phones, ovens, lamps, cameras etc. In fact, if you have seen those combination appliances such as the coffee-maker/breakfast cooker, or the hot-dog maker (cooks franks and toasts buns), you’ll find that they are not best sellers. Why?

I think that the consumer is prepared to pay for specific functionality, not for stuff that they don’t need or understand. Also, they generally use more complicated (or potentially dangerous) appliances in a single threaded manner. Drive car (doing other things at the same time is dangerous). Microwave food (not metal, careful on timing). Saute food (make sure gas is lit, not too high, watch contents or it’ll all burn). Speak on phone (rude to not give the other end focused attention, otherwise why call?). Twitter (and listen to music, or just hear ear candy?).

It’s about focus.

Contrary to contemporary thinking, for effective, focused deep communication, the human brain is generally single threaded. Engaging in effective communication and processing deep concepts requires focused cerebral compute power. Yes, you can multi-task and time-slice your way through many things at once, but at what price comprehension?

Back to application-centric machines.

The iPhone is a perfect example. It is what you need it to be at that point in time: phone, browser, diary, etc. You don't care where the data is stored, as long as it is! So you don't need a file manager. The same goes for the newly released iPad. I believe we are seeing a new era of interfaces: more than just 'app'-centric, it is really appliance- or function-centric. Dare I say 'workflow-centric'? Keep the tasks very focused, do a really good job for the consumer, then enable them to change appliances effortlessly. We'll see some appliances running in the background (music players, communications, content downloads etc.), just like the refrigerator keeps things cool, and the bread maker makes my next loaf for more toast while the coffee maker makes the coffee. I'll still need a toaster, and then I have to pour the coffee and add milk from the fridge. Multi-tasking will not go away, but how we focus on our tasks will need to become more efficient and less disruptive than clicking all over a desktop, or rummaging through unruly digital filing cabinets…

The appliance-centric model provides us a clue to the future. A future that is focused on using information, rather than ‘managing information’. This needs to be the goal of computing for the mass consumer marketplace.

Now, we just need a standardized information utility grid to plug into the information… or is that the ‘cloud’?

Tell me it isn’t so… I’m listening.

February 4, 2010 at 1:00 am

computing utilities cometh

It’s quick. It’s clean. It’s pure.
It could change your life, rest assured.
It’s the 21st century cure,
And it’s my job to steal and rob GRAVES!

Computing and IT are entering a new phase. Think of it as the industrialization of information processing. The glory of having the fastest chip, the fastest router, the fastest drive, in fact the fastest anything, is quickly becoming Pyrrhic. It's all about the computing ecosystem.

Some call it cloud computing, others call it scalability – in fact it is something very different. It is the computing utility. Think of it this way…

When technology companies created systems or solutions, they focused either on a horizontal marketplace, i.e. all things to the most people, or on a vertical, i.e. specialized systems for specific applications. But when you think beyond systems, focus on the absolute efficiencies of the individual components, and then configure those components to interact as an ecosystem rather than just a system… well, it's like designing communities instead of architecting houses. Heat, power, traffic, serviceability, security etc. all take on different meanings. It is about more than how powerfully an individual component performs.

And so it is with the new data centers. Reliability is not about five 9’s – it is about 100% up-time. It demands new engineering approaches. It is about questioning everything – even down to whether an LED is essential on a device, whether the rack is designed for maximum airflow, whether the air-conditioning leverages real thermodynamic principles, whether power is properly conditioned, whether the cost per computing unit is the lowest and most economical so that your users can build viable business models.
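The gap between 'five 9's' and 100% is easy to put in numbers, because availability translates directly into permitted downtime per year:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes(availability: float) -> float:
    """Minutes of permitted downtime per year at a given availability."""
    return (1.0 - availability) * MINUTES_PER_YEAR

# Five 9's (0.99999) still concedes about 5.26 minutes of outage a year;
# 100% up-time concedes none, which is why it demands different engineering.
```

That five-minute budget is precisely what the questioning-everything approach is trying to engineer away.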

These new data centers are not only providing the compute utilities of the future, they are inventing new technologies, IT components and processes. Most importantly, they eat their own dog food instead of extolling the virtues of their designs with clever marketing paradigms, as technology purveyors do. They are their own reference. They can provide real performance and reliability metrics, and many are making their approaches truly 'open' by showing you how to do it. This is a very disruptive trend: if these companies keep doing it themselves because the traditional technology companies cannot solve these engineering problems, then they will become the technology companies of the future… much like power utilities. At some point, what will be the incentive to compete and innovate? An even lower cost per compute cycle…?

Edison may have been a visionary, but it was Tesla who pioneered the concept of utility electricity. Edison wanted everyone to buy their own generator, and keep buying services, wiring and light bulbs… from Edison.

I think we’ve reached a new tipping point in IT.

Tell me it isn’t so… I’m listening.

December 10, 2009 at 1:00 am

what’s in a name? clouds

My thoughts are scattered and they’re cloudy,
They have no borders, no boundaries.
They echo and they swell
From Tolstoy to Tinker Bell.
Down from Berkeley to Carmel.
Got some pictures in my pocket and a lot of time to kill.

There’s a lot of confusion over the term ‘cloud computing’. Some see it as marketing buzz, others see a new technological approach, some a business model shift.

It’s all of the above.

We have all seen the inexorable shift to the network. Before the web there was thin-client computing; before that, dumb terminals. In fact, a very notable technology company pronounced well over 25 years ago that 'the network is the computer'. Before that, Ken Olsen of DEC saw no need for personal computers, and Thomas Watson of IBM reputedly saw a need for perhaps only five computers in the world. Combine all of these seemingly orthogonal declarations and you could conclude that, in the fullness of time, there may be only a handful of computing utilities in the world, and that, in a sense, they were all correct.

There has always been a pendulum shift between centralized and decentralized management of computing resources and information. Sometimes these swings were in sync, at other times they were in opposition i.e. centralized information with decentralized processing and vice-versa.

Regardless, everything is moving to the network. So it makes sense that, in order to bridge both processing and informational requirements, we had to develop technologies to enable that shift in an economical manner, with the economics of deployment, ownership and migration all considered.

There is a distinction between the technology that enables 'cloud computing' (virtualization of infrastructure, virtual machines and software stacks, all coexisting on and simultaneously leveraging the resources of utility computing) and the applications which use that technology. Hence the confusion.

Talk to a techo and the discussion will be about the technology that makes this happen, with detailed explanations of why their cloud is better than the competition's. Talk to a business-focused person and they'll tell you that delivering services on a network infrastructure, or as a service, has been happening for years and that none of this is very new. Both are right. Context.

Confused? Well, just as .pdf and HTML clearly drew upon their predecessors, PostScript and SGML, so clouds draw upon previous generations of computer architecture. At a philosophical level there really is no difference; clouds really are just 'water vapor'. However, when you dig into the detail, there are developments in the technology, and business opportunities, that may, and should, be leveraged.

Why not see if it works for you and your business? If the math works, all variables considered (including risk and other intangibles), then there is no need to be comfortably numb… if it doesn’t, then at least your math will tell you why. But please, don’t let the hype and your own personal lack of diligence prevent your exploration… and your opportunity to learn.
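That math can be made concrete. A toy break-even comparison between owned infrastructure and pay-per-use cloud (every figure below is an illustrative placeholder, not a real price, and risk and other intangibles would need their own terms):

```python
def break_even_hours(capex: float, opex_per_hour: float,
                     cloud_per_hour: float) -> float:
    """Usage hours at which owning hardware becomes cheaper than renting.

    Owned cost = capex + opex_per_hour * hours
    Cloud cost = cloud_per_hour * hours
    Break-even is where the two are equal.
    """
    if cloud_per_hour <= opex_per_hour:
        return float("inf")  # cloud is cheaper at any utilization
    return capex / (cloud_per_hour - opex_per_hour)

# e.g. a $10,000 server at $0.50/hr to run, versus $1.50/hr in the cloud:
# owning pays off only after 10,000 hours of actual use.
```

If your utilization never reaches the break-even point, the math has answered the question for you.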

Tell me it isn’t so… I’m listening.

October 22, 2009 at 1:00 am


I fell down in the desert baby, yeah
I had nothing but a piece of paper, oh yeah,
I had to write something down,
And I found myself alone, then I let go of everything,
Into another dimension

Another year and another NAB. Although I typically post on a Thursday, I was dragging by the end of the show. Getting home late after such a draining exercise was a great reason to postpone any writing effort.

NAB2009 South Hall lower, looking back towards upper level

A couple of observations.

  • Attendance was down. No surprise. But an interesting side-effect was that the quality of the meetings I had was, on average, far better, with more engaged people. There were fewer tire kickers.
  • Time was important. At most NABs, people are late for meetings, if they show up at all. This year, for some reason, I cannot think of one scheduled meeting that was late or cancelled.
  • 3D is here. Most other stuff seemed to be about speeds and feeds, or derivatives of existing products and processes. In the 3D world, however, there was a great deal of innovation happening, focused on production, presentation technologies and getting the technology to the consumer. It's just around the corner, folks.
  • Business models. Sun held our Media Advisory Board and the topic was advanced advertising. I was amazed at the level of interest, the degree to which some organizations are investing and building their business models around it, and, most importantly, the lack of a common language as to what it all means. Even our friends in the advertising world are not in agreement. Sun also conducted a test of a pilot Video as a Service for IPTV delivery.

Video as a Service test pilot at NAB2009

The economy. Every year at NAB there is a job board, but this year it was busier than ever.

One of the many job boards at NAB2009

Tell me it isn't so… I'm listening.

Looking forward to seeing you all next year.

April 24, 2009 at 9:44 am

why molten lead?

As bits to the screen, so are the days of our lives…

In the late 80's and early 90's I was involved with a publishing startup. We produced the first full color 'vogue quality' magazines targeted specifically at verticals within the computer industry. Starting in a spare back room and outsourcing what we didn't know how to do, we quickly progressed from typesetting to an early version of Ventura (running on GEM), through to a network of PCs running Quark. Yes, Windows-crazy, eh? We should have been using Macs, but they were too expensive; we loved VMS and unix, PCs were a bit more interoperable with them than MacOS at the time, and PCs we built ourselves were a heck of a lot cheaper. We went from one solitary magazine to five, and by the way, the magazines were free!

A couple of things we learned. Mr. Packer and Mr. Murdoch had just replaced legions of typesetters and presses with digitally enabled .pdf-based printers that signaled the end of printing as we knew it. They were starting to produce color newspapers! We also learned that 'desktop publishing' (remember that cute phrase?) for the masses really meant seeing how many fonts and colors could be crammed onto a page. And we learned that you could take a skill, like good page design, repackage it with a different tool, start a new 'cheap as chips' business and make some very good money. Targeted advertising for an industry segment, leveraging free content!

So… adding that experience to my varied experiences in media leads me to the following conclusion. A conclusion, by the way, that I have been espousing since October 2003 (if you're interested, I can tell you why that date).

Here it is. The world of video is changing. Captain Obvious, you say! But just as printing went from molten lead to bits, so video moves from BNC to bits, and from video engineers to IT engineers. Still obvious. Printing went through this transition almost 20 years ago and is now losing serious money as a segment. We know that too. This time video will meet the same fate, but it will not take 20 years! Look at the news. The content owners and distributors want your identity, not because they want to see what you're ripping off, but because they want to prop up their model. Too little, too late?

Why and how? That I'll explain in forthcoming blogs.

Tell me it isn't so… I'm listening.

March 5, 2009 at 10:23 pm


