January 16, 2009

Streaming Video Playback Speed Controls - Two Innovative Methods

One of the coolest playback features for online video, especially academic video, is a player that can speed up (or slow down) the playback of a streaming video.  Way back in the early 2000's there was a tool called Enounce that acted as a plugin to RealPlayer or Windows Media Player and added a control slider to the player.  Everything from half-speed to 5x playback, with no pitch change on the audio.  It was very effective for watching lectures or news content - much material can be absorbed far faster than it's spoken.  Turns out that Enounce is still available and works pretty well, and they've announced a version called MySpeed that supports embedded Flash video.

End-users can buy and install Enounce and use it on their systems.  It's a native Windows-only application and must be installed individually on each system.

OK, that's great, but I want this as a feature of my website - I want all my Flash videos to appear with a speed control for all users.  To date, I've been unable to find any way to do this - no one I've spoken with seems to know how to write code for Flash Player that will permit a speed control.  I'm told it's currently not possible.

Then I came upon Bloggingheads.tv.  Bloggingheads.tv includes a Flash-based player (derived from the JW Media Player 3.2)  that has a "1.4x" button that bumps up the playback speed -- perfectly intelligible, but much quicker playback for taking in a long talk in a jiffy.  They did the impossible!

I had to know how they did it, so I did some poking around. Turns out they didn't do the impossible, they did an end-run around it.  The playlist that their flash player reads for each video program references two media files.  Here's the relevant code snippet from the XSPF-format playlist:

<location>
    rtmp://mirror-image.bloggingheads.tv/bloggingheads/flash
</location>
<identifier>bhtv-2009-01-13-pb-jg-100x.flv</identifier>
<meta rel="alternate">
   rtmp://mirror-image.bloggingheads.tv/bloggingheads/flash/bhtv-2009-01-13-pb-jg-140x.flv
</meta>

So, they created an alternate encoding of each video, one with the 1.4x timeline baked right in.  The player needed some modification to play this, but only so that the time, the duration, and the location bar all show appropriately scaled values as the video plays.  After all, a 30-minute video encoded to play at 1.4x is actually only about a 21-minute file, but the timeline still needs to show it as the 30 minutes of the original content.
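
Here's a minimal sketch of that scaling - my own guess at the idea, not Bloggingheads' actual code: every time value read from the fast stream gets multiplied back up before display.

var SPEED = 1.4;   // which encoding is currently playing (1.0 or 1.4)

function formatTime(s) {
    // format seconds as m:ss
    var m = Math.floor(s / 60);
    var sec = Math.floor(s % 60);
    return m + ':' + (sec < 10 ? '0' : '') + sec;
}

function displayTime(streamSeconds) {
    // a position of 21:00 within the 1.4x file displays as 29:24
    return formatTime(streamSeconds * SPEED);
}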

When you switch from one speed to another while playing, the stream rebuffers and seeks to the same spot in the video, so there's just a momentary pause in playback switching from one stream to another.

It's a great workaround - although for my purposes (user-generated content, thousands of contributors) I'd still prefer a player-based way to do it, so it can apply equally to video from all sources without requiring added backend processing.  Still... this is the only solution I've ever seen for this issue that (a) works for Flash video and (b) doesn't require an additional plugin.

September 11, 2008

Tools for testing streaming media

So here's a neat trick for testing how streaming media and web applications perform for users with limited bandwidth connections.  OK...I'm getting some rolling eyes at the premise. "Does anyone have limited bandwidth connections anymore?"  Turns out that, "Yup...they do."  

Two cases in point:  
  • Last fall, the Harvard Alumni Association offered an all-online version of the popular undergraduate course, "Justice" to Harvard alumni and their invited guests.  Over 3000 participants signed up for the term-length 24-lecture course, delivered via Flash streaming (rtmp) video encoded for broadband (~400kbps). Most common technical complaint from users: video performance and rebuffering due to insufficient bandwidth. 

  • So, I've got one of those Verizon Wireless Cellular Modems, a little USB one. Great 500kbps+ broadband in the city, but when I go on one of my frequent trips to coastal Maine, I'm lucky if I get 80kbps.  Actually, I really am lucky, as the alternative is 56k dial-up.  Wide swaths of the geography are not covered by cable or DSL, and cellular is the best I can do.  People have to drive into town to get Wifi at the local cafe, or suffer with a slow connection from home.

So, even among a net-savvy demographic of people who otherwise have resources, there remains a small but significant need for low-bitrate video solutions.  Currently, we're encoding a new set of programs for multiple bitrates all the way from HD to dial-up, and testing has been an issue.  Two techniques have been lifesavers.

For Windows, NetLimiter, an inexpensive bandwidth-simulator utility I wrote about on streamingmedia.com (and commented on here) a while back, lets me set my throughput to whatever I like.  I can pretend I'm on my coastal-Maine cell modem, on dial-up, or on anything else, easily.

For the Mac, the capability is already built into the OS, thanks to its Unix roots: the ipfw command.  You set up bandwidth limits by creating "pipes" with bandwidth caps, then associating those pipes with the ports you want limited.  Here's how to set up a bandwidth limiter for testing rtmp Flash streaming (port 1935).  Note that if you're not logged in as root, you'll need sudo to run these:

sudo ipfw pipe 1 config bw 400kbps
sudo ipfw add 10 pipe 1 tcp from any to me 1935
sudo ipfw add 11 pipe 1 tcp from any 1935 to me

Change it at will by issuing the pipe command again...

sudo ipfw pipe 1 config bw 1400kbps

You can also introduce simulated network latency, control outbound bandwidth separately from inbound, and control bandwidth to or from a single IP address or subnet.  There's great documentation at Luigi Rizzo's Dummynet site.  Thanks also to Ask Bjorn Hansen for his mini-tutorial on this.
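
For example, using the same pipe syntax as above (the numbers here are arbitrary, and check man ipfw on your OS version for the exact units):

# add 150 ms of simulated latency (delay is in milliseconds)
sudo ipfw pipe 1 config bw 400kbps delay 150

# shape outbound traffic separately through a second pipe
sudo ipfw pipe 2 config bw 128kbps
sudo ipfw add 12 pipe 2 tcp from me to any 1935

# or limit only traffic coming from one subnet
sudo ipfw add 13 pipe 1 tcp from 192.168.1.0/24 to me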

April 04, 2007

Fundamentals of Website Development - Course Resources

Last night I had the privilege of giving a guest lecture on streaming video in Dave Heitmeyer's Harvard course, Fundamentals of Website Development.  As a follow-up, here are some further references and resources on topics that came up in class.
Anything I forgot?  Leave me a comment and I'll update this as needed...

March 27, 2007

A Full-Featured Flash Video Player

Flash video is great for users, but the player lacks the easy, built-in features Web developers have come to expect.  Now, Jeroen Wijering has developed the full-featured Flash Video Player 3.6, which finally makes all the features of a "real" video player available to Web developers using Flash video on their sites.

The standard video players - RealPlayer, QuickTime, and Windows Media - all have APIs that make it easy to embed interactive video in a Web page, with the simple run-time customization capabilities developers have come to expect from video platforms.  By setting values in either the web page or the metafile (.ram, .asx, .qtl), you could accomplish a lot:
  • support for metafiles that can be generated on-the-fly
  • playlists
  • background colors and logos
  • captioning
  • control over the appearance of the player controls
  • fullscreen mode
  • autostart and repeat behavior
Naturally, if you are a Flash developer, you can make a player that handles all of this.  Indeed, unless you're simply hard-coding an .flv URL into the stock Flash video player, you have to do Flash development to make a more capable player.  Jeroen's Flash Video Player 3.6 solves all that.  With an elegant API that works through metafiles or FlashVars, you can customize the playback experience without having to do a lick of Flash development.  What's more, a full Javascript API includes controls (playlist navigation, play/pause, scrub and seek, volume control, and movie loading), Javascript callbacks, and metadata extraction.
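
For a sense of the flavor, an embed might look roughly like this.  This is a sketch from memory - the flashvar names (file, autostart, repeat) are what I recall from the player's documentation, so check them against your version:

<embed src="flvplayer.swf"
       width="320" height="260"
       type="application/x-shockwave-flash"
       flashvars="file=myvideo.flv&autostart=false&repeat=true" />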

This player covers all the important bases in terms of the video player capabilities Web developers need, and makes publishing Flash video as easy as publishing Real, QT or Windows Media.  It's distributed under a Creative Commons License, free for non-commercial use, and nearly free for commercial use.

January 04, 2007

User-Generated Media - Challenges & Solutions for Business and Academia

Social networking and user-generated content (UGC) sites present unique technical challenges, which lead to unique business challenges.  While unexpected growth is a potential problem for any online site, it is both the holy grail and (in the spirit of "be careful what you wish for") a ticking time bomb for social networking sites. 

A new whitepaper from Akamai (also available free from streamingmedia.com) goes into some depth about the special factors that affect social networking sites.  Some highlights:
  • User-generated content sites are the fastest-growing category of web site (by unique visitors) on the Net, showing, in some cases, triple digit year-over-year growth. Of the ten fastest growing web brands, five are UGC sites (for example, Flickr and Wikipedia). 
  • Social networking/UGC sites have, by definition, unpredictable storage and bandwidth needs, making technical infrastructure (and therefore, budget and capital expense) planning a crap shoot.  Outsourced capacity on-demand is an important option to consider before you're faced with site-crippling runaway success. 
  • Success is tied closely to having a fast innovation cycle -- try stuff out, see how it works for your users.  Continually sense-and-respond to user needs to find that sweet spot of simplicity, functionality, and sustainability that makes your site sticky and social.  One way to do this is to minimize the time and effort you put into infrastructure build-out and put it into more creative endeavors. 
  • If you're an ad-driven site, performance is directly tied to revenue, as faster loading pages keep eyeballs on the site, lead to more page views per user, and therefore register more ad impressions.  When Friendster moved to Akamai's delivery network in March 2006, they saw an immediate 33% decrease in page load times, and a threefold uptick in page views.
Even for an educational institution, outsourcing certain infrastructure is appealing.  With service-oriented Web APIs, it can be easier now to work with a vendor/partner than it is to build it myself.  If I want to put up a quick video recording/encoding/sharing service for my users, I can:
  • Build it myself - not always a bad idea, and definitely a quick-and-dirty solution for a pilot or proof-of-concept, provided I have the staff and the time to move it from P-O-C to production-ready if the need arises.
  • Acquire and deploy an inexpensive product.  I was surprised to find YouTube clones like Clip'Share and Altrasoft VideoShare for a few hundred bucks or less.  Again - good for a proof-of-concept.  May or may not offer enough for coping with real success.
  • Use a Web Service API like that from Video Egg or JumpCut to handle all the media operations, while you focus just on your website.  These services handle media input (in the case of Video Egg, from webcam and cell phone, as well as file upload), transcoding, online editing, and delivery.  This approach can provide a platform for rapid development of your own custom solutions, as well as a scalable path in case your solution takes off.
I'm generally a big fan of institutions building their media solutions in-house, but the combination of the unpredictable needs of user-generated media, the ease and excellence of some of the vendor service-based APIs, and the need to be able to innovate quickly without up-front investment in big infrastructure creates some interesting possibilities.  

The Akamai white paper, Successful Social Networking and User-Generated-Content Applications: What You Need to Know, (which, by the way, I wrote) addresses some other challenges of social and UGC sites -- how edge-caching works with dynamic content, how to control costs when growth is unpredictable, options for exercising editorial control over UGC sites, and some examples of how social networking is being used by businesses to build revenue and create new opportunities.  

October 19, 2006

Code snippet to embed video in a page

YouTube and its ilk have made embedding video into a web page simple for people who are not developers and HTML gurus.  For institutional video installations like ours at Harvard, it can be just as simple for our users to embed internally hosted video in their course pages, Websites, and blogs.  All you need is a small Javascript file that generates the HTML that embeds the player.  This file lives somewhere on your Web server, and people wanting to embed video in their pages simply reference it with a small snippet of HTML in their Web page.  Here's a simple snippet of HTML that users can use to generate an embedded video player:
<script src="http://www.learningapi.com/blog/scripts/embedRealVideo.js" type="text/javascript"
clipUrl="rtsp://video2.harvard.edu/newsoffc/EOWilson.rm" >
</script>

The embedRealVideo.js script generates the EMBED statement that displays the video in the page.  Its source code can easily be modified to support Windows Media or Quicktime plugins as well.  The user embedding video just has to paste the above code snippet into their page, making sure to edit the clipUrl field appropriately.  For this RealPlayer example, that URL can be a direct rtsp:// link, or an http:// link to ramgen or a .ram file.
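
Incidentally, there's nothing exotic about a .ram metafile - it's just a plain-text file whose entire contents are the stream URL:

rtsp://video2.harvard.edu/newsoffc/EOWilson.rm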

Here's the source of the script, embedRealVideo.js:

// read the clipUrl attribute from this script's own <script> tag
pClipUrl = "";
var scripts = document.getElementsByTagName('script');
var index = scripts.length - 1;
var myScript = scripts[index];   // the last script element is this one
if (pClipUrl == "") {
    pClipUrl = myScript.getAttribute("clipUrl");
}

// write three RealPlayer plugin panes, tied together by console="Clip1":
// the video window, the status bar, and the control panel
document.write('<embed type="audio/x-pn-realaudio-plugin" src="' + pClipUrl + '" width="320" height="240" controls="ImageWindow" autostart="FALSE" console="Clip1"></embed><br>');
document.write('<embed type="audio/x-pn-realaudio-plugin" src="' + pClipUrl + '" width="320" height="30" controls="StatusBar" autostart="FALSE" console="Clip1"></embed><br>');
document.write('<embed type="audio/x-pn-realaudio-plugin" src="' + pClipUrl + '" width="320" height="26" controls="ControlPanel" autostart="FALSE" console="Clip1"></embed>');


August 01, 2006

How do you measure the value of Instructional Technology, or any technology investment?

People try to bean-count when investing in "enabling technology".  They attempt to put into financial terms the value of a content management system, or adopting streaming video, or an intranet portal -- all the while looking for that bottom line to justify the cost.  Vendors publish whitepapers full of hard business-case numbers meant to convince IT shops to make an investment in a certain platform or technology.

"Switch to Flash video and you'll save $350k a year!"  It's possible.  Although when you look at the details, the ROI numbers seem more like estimates built on assumptions bolstered by guesses.  Perhaps, "Use Flash video and you'll build customer loyalty due to the excellent user experience (which can help lead to increased sales or market share)" is more realistic, if less quantified.  

This is the topic of Harvard Business School professor Andy McAfee's latest blog post, The Case Against the Business Case.  Andy notes that using hard numbers to justify IT investment is natural -- numbers are the terms that business people traditionally use to measure cost and value.  But he points out that the chain between cause and effect of IT innovation can be long, complicated, and nearly impossible to quantify.

I’ve probably seen hundreds of business cases that identify the benefits of adopting one piece of IT or another, assign a dollar value to those benefits, then ascribe that entire amount to the technology alone when calculating its ROI.  The first two steps of this process are at best estimates, and at worst pure speculation.  The final step gives no credit and assigns no value to contemporaneous individual- and organization-level changes.

Some leaders instinctively have a sense of what kind of investment is going to lead to these intangible benefits.  They seem to naturally turn an organization towards the kinds of IT investments and organizational structure that's capable of capitalizing on IT innovation.  Yet, these leaders often have an uphill battle convincing the rest of their organization to follow when an investment does not have the hard numbers to provide assurance.  Andy notes:

One half of the ‘classic’ business case— the costs— can be assessed in advance with pretty high precision.  We know by now what the main elements of an ERP, BI,  Web enablement, systems integration, etc. effort are, and what their cost drivers are.  And we also know the capabilities that different types of IT deliver if they’re adopted successfully—if the human and organizational capital are well-aligned with the information capital.

It's this last part that interests me.  Certainly, the benefits of some IT innovations can be measured directly.  A recent Boston Consulting Group study on innovation reports that  corporate spending on "innovation" is up, even while companies are not satisfied with the results of prior innovation spending.  Among the companies studied, the most popular metrics for measuring innovation were time to market, new-product sales, and return on investment.  For product development organizations, these might be good measures.  

But what about service- or knowledge-based organizations?  How do justifiable decisions get made about technology investments?  In my experience, the most sustainable technology innovation comes from organizations with visionary leadership and a culture that gives creative staff the freedom to explore, experiment, and sometimes fail.  As Andy suggests, it's about both the nature of the innovation and the ability of the organization to capitalize on it.  But even with that, some paths lead to real business benefit and others do not.  Perhaps you can't measure which are which, but many feel they can tell the difference when they see it.  How do you tell the difference?

July 16, 2006

One-click iTunes Podcast Subscriptions

While building out the podcasting features of the Videotools Media Content Management system for Harvard Business School, we were trying to figure out the simplest way to do a "one-click" RSS subscription for iTunes users.  Before we explored it, we had some open questions: do you have to register your feed with the iTunes podcast directory for it to work?  Will it work without complicated client-side Javascript?  With all the proprietary "iTunes U" university stuff out there, will a totally standards-based RSS feed work?

Turns out that it's easier than we expected, although surprisingly, there were few documented examples on the Web showing how simple it really is.  

To make a one-click iTunes subscription link, just link to the RSS feed from your Website using a URL with the protocol prefix itpc://.  So, a one-click iTunes subscription link to a podcast looks like this:

itpc://www.learningapi.com/rss/podcast.xml
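
In your page's HTML, that's just an ordinary link:

<a href="itpc://www.learningapi.com/rss/podcast.xml">Subscribe in iTunes</a>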

That's all there is to it.   No registration required, no Javascript, nothing special.  This will not enter your feed into the iTunes Music Store's directory, so you won't get rankings, etc from iTunes. For our purposes, which are really intranet-oriented podcasts, we don't want publicity beyond our own user population, so that's a bonus. When a user who has iTunes installed clicks this link, it will automatically subscribe them to the RSS feed as an iTunes podcast.  While it was tempting to explore many of the other RSS one-click options (noticing the Odeo and PodNova options on my local NPR station), we determined that for our users, offering a one-click for  iTunes along with a plain RSS link for manual copy/paste was the sweet spot of user choice and simplicity.  

One other nice touch that's become common for making RSS feeds more friendly to new users - if you click on the RSS feed link (for either podcast or simple RSS news  feeds), it's styled in the browser using XSL, so that it's human-readable, with some helpful instructions for what to do next. 
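
If you want to do the same, the hookup is a single processing instruction at the top of the feed pointing at your stylesheet (the filename here is just an example):

<?xml-stylesheet type="text/xsl" href="podcast-style.xsl"?>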


February 03, 2006

Debugging tools for Web Developers - for Firefox & Internet Explorer

For debugging Web sites and Web applications, there's nothing that beats the tools available in Mozilla Firefox.  Tools like LiveHTTPHeaders, the WebDeveloper toolbar, and others make Firefox a dream for developers.  But if a site is acting up in Internet Explorer, it can be difficult to figure out why - few tools are available, and those that do exist are often a poor substitute for the ones available for Firefox.

Still, there are good tools available to aid development in both browsers. I've put together a short list of  the developer tools I've found useful in each.  This isn't a comprehensive reference, but it's a short-list to the tools that I, as a Web Developer, have come to depend on.

Developer Toolbar

Firefox: The WebDeveloper toolbar is a comprehensive debugging tool.  Some highlights:
  • DOM Inspector - walk and manipulate the DOM live; change attributes of page elements right in the browser to see the results.
  • Live CSS Editing - view and edit all style information for the page while looking at it - no reloads, just edit and see the results.
  • Convert POSTs to GETs and GETs to POSTs - great for seeing the parameters being passed in a POST request.
  • Outline DIVs, tables, and other block-level elements in a page.
  • Shortcuts to clear the cache, examine and clear cookies, HTTP authentication, and session cookies.
  • Turn images off, show image sizes, show alt text.
  • Automatic validation of HTML, CSS, and links; load-time analysis.

Internet Explorer: The IE Developer Toolbar has a limited subset of its Firefox equivalent's features, but is still very useful.
  • DOM Inspector similar to Firefox's.
  • Display image dimensions, block sizes, and class and id information.
  • A nifty ruler for measuring the exact pixel location (x,y) of any spot on the page; it measures distances/offsets between two points as well.
  • Validation tools.
  • Shortcuts to clear the cache and to view or clear cookies.

HTTP Header Inspection

Firefox: LiveHTTPHeaders - see all the HTTP headers from every request and response.  Great for sniffing out MIME types and form submissions.  Has a replay mode and lots of configuration options.

Internet Explorer: ieHTTPHeaders - same idea as LiveHTTPHeaders, but with fewer bells and whistles.

Image and Color Information

Firefox: ColorZilla's most important feature is the eyedropper tool it adds to Firefox's status bar.  It displays the color (in hex or RGB) of the pixel currently under the mouse cursor on the page.

Javascript Debugging

Firefox: The Venkman Javascript Debugger is a full-featured debugging environment for Javascript.  Includes breakpoints, watches, object inspection, expression evaluation, and useful error messages.  If you've used a real code debugger for Java or other languages, you'll be right at home with Venkman.

Internet Explorer: Microsoft Script Debugger is weak, but it's better than nothing. [download here]

Built-in Features

Firefox:
  • Javascript console for viewing all errors and warnings as a page loads and executes.
  • Full syntax highlighting on View Source.
  • Select part of a page, then right-click and choose View Selection Source to see just that portion of the page's source code.

DOM Source Viewing

Firefox: Apparently, if you select all of a page and then View Selection Source, the source viewer shows document.write()-generated HTML, but not inserted DOM elements.

Internet Explorer: Full Source lets you see not just the code as loaded in a page, but any HTML or script code generated by other code.  All that stuff written to the page with document.write() or createElement() becomes visible.  There are some other useful-looking gizmos in this package as well, although Full Source is the only one I've used.


January 31, 2006

Contextual Search API from Yahoo - Keyword Extraction for free

I've been playing with some of Yahoo's search APIs lately.  In particular, I was intrigued by the Content Analysis Service, which takes a block of text, along with an optional "helper phrase" that points to the context of the subject matter, and extracts keywords from it.  I'm always on the lookout for technologies that can help categorize or 'gist' content.  In particular, the speech-to-text data extracted via voice recognition from podcasts, videos, and lectures isn't a good enough transcript to read, but it's usually good enough to search.  Is keyword extraction a useful tool for getting the topics from a blob of text?  Try it and see!

The folks at the BBC certainly found the Content Analysis service useful for doing research into the connections and relations among public figures and politicians.  Using the service to extract people's names from public documents, the team was able to create "six degrees of separation"-type graphs of who-knows-whom (or at least who is "associated-with-whom") very quickly and at low cost.

It took some time to figure out the code for this and get it all to work, but here's an example of it in action.  Here I used the text from my recent post - Digital Asset Management - Some Advice, but you can paste your own in here to try it out.  When you click on Run Query, the data will submit to Yahoo's ContextualAnalysisService via a PHP proxy on my website (to get around cross-domain scripting security restrictions in the browser), and the results will pop up under the form, AJAX-style.  This query uses Yahoo's JSON API, a simple and lightweight protocol for data exchange.  

FORM-BASED PROXY VERSION (any browser)
[interactive demo form: helper phrase, text to process, and a Run Query button]


There's another technique for making these AJAX calls that does not require a proxy - it employs SCRIPT tags dynamically added to the page (inserting DOM elements) with SRC attributes that call the Yahoo API.  The inexplicable problem I found is that this version works in Firefox/Netscape but not in IE.  I'm unable to figure out why, since other sites using the very same code work fine.  The SCRIPT element is written to the DOM with a SRC URL which - if I copy and paste it directly into the browser - works.  But when I write the SCRIPT element to the page, IE never makes the HTTP call to retrieve it.  Unfortunately, IE's developer and debugging tools are so poor that it's difficult to find out what's going on.  If anyone has a suggestion, please share it with me.

Update - thanks to colleague Jeff Griffith at HBS, who discovered that the reason IE is not working is that the block of text submitted in the form was too long, violating a character limit that IE apparently has for SCRIPT SRC attributes.  Shortening the text solved the problem.
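
For reference, here's a minimal sketch of that SCRIPT-tag technique.  The endpoint, parameter names, and result structure below are my recollection of Yahoo's V1 API - treat them as assumptions and check the current docs:

// dynamically add a SCRIPT element whose SRC calls the Yahoo API;
// Yahoo wraps the JSON result in a call to the named callback function
function extractTerms(text) {
    var s = document.createElement('script');
    s.type = 'text/javascript';
    s.src = 'http://search.yahooapis.com/ContentAnalysisService/V1/termExtraction'
          + '?appid=YOUR_APP_ID&output=json&callback=showTerms'
          + '&context=' + encodeURIComponent(text);  // keep this short - see the update above
    document.getElementsByTagName('head')[0].appendChild(s);
}

function showTerms(data) {
    // Yahoo calls this function, passing the extracted keywords
    alert(data.ResultSet.Result.join(', '));
}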


SCRIPT TAG VERSION (fails in IE when the submitted text is long - see the update above)
[interactive demo form: helper phrase and text to process]


January 26, 2006

Resizing images in the page - a cool technology tip

Last month I wrote about Cucusoft's iPod Video Converter for converting DVDs to iPod video, and included a screenshot of the app in action.  As usual with screenshots, I had to choose between having the image appear full-size and crystal clear, or reducing it to fit the page and accepting visible, objectionable resizing artifacts.  Looking around for a CSS/Javascript solution, I came upon John Berry's elegant answer on his Agile Partners Weblog - a movable slider to resize the image on the fly.  Drag the slider to see it work!

This script uses the Prototype and script.aculo.us Javascript libraries.  Scriptaculous builds upon Prototype and adds some amazing DHTML effects, including the slider control used in this demo.  You can view source on this page to see the complete code, but in a nutshell, you:
  • download and include the Prototype and Scriptaculous javascript libraries to your page:

    <script src="scripts/prototype.js" type="text/javascript" language="javascript"></script>
    <script src="scripts/scriptaculous.js" type="text/javascript" language="javascript"></script>

  • add the HTML for the slider and the image file.  Note that you may resize a collection of images by adding additional <DIV class='scale-image'> elements within the enclosing <div>:

    <div style="border: 0px solid #ddd; width: 695px; overflow: auto; float:left;">
      <div class="scale-image" style="width: 695px; padding: 10px; float: left;">
        <img src="http://www.emediacommunications.biz/files/cucusoftdvdtoipod.jpg" width="100%"/>
      </div>
    </div>

  • and the HTML for the slider itself
    <div id="track1" style="border:1px solid #BBCCDD; width: 200px; background-image: url('files/scaler_slider_track.gif'); background-repeat: repeat-x; background-position: center left; height:18px; margin: 4px 0 0 10px;">
      <div id="handle1" style="width: 18px; height: 18px;">
      <img src="files/scaler_slider_gray.gif"/>
      </div><a href="#" style="font-size:small; float:right;" onClick="setSlider(1); return false;">[view full-size]</a>
    </div>

  • finally, include the custom resize_slider.js script.  This must go after the slider HTML in your page.

    <script type="text/javascript" src="resize_slider.js" ></script>

Here's another great example of this script in action.

January 05, 2006

Prototype - developers' toolkit for AJAX

Here's another great tool for getting started on building AJAX applications - Prototype.  The basic frameworks for AJAX can be remarkably simple to use, although they're not always so simple to figure out at first.  

But a terrific example of how easy AJAX with Prototype can be is at  24Ways.org.   Author Drew McLellan does a great job walking through a simple but elegant example of  easy AJAX in action.  
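
The heart of it is just a few lines.  As a sketch (the URL and element id here are placeholders for your own):

// fetch a fragment from the server and drop it into the page
new Ajax.Request('/data/snippet.html', {
    method: 'get',
    onSuccess: function(transport) {
        $('results').innerHTML = transport.responseText;
    },
    onFailure: function() {
        alert('request failed');
    }
});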

Many thanks to web developer extraordinaire Leo Haskin for pointing this site out to me!  

Posted by larryb at 06:55 AM [permanent link]
Category: Web and Software Development

January 04, 2006

The Web's Best Javascript, DOM, and CSS Tips & Tricks

One of my colleagues at HBS pointed me at what's possibly the best CSS, Javascript  and DOM reference site I've seen. It also includes hugely useful (and rare) information  about browser compatibility.  Quirksmode.org is the personal project of web developer Peter-Paul Koch.  In his words:

It contains more than 150 pages with CSS and JavaScript tips and tricks, and is one of the best sources on the WWW for studying and defeating browser incompatibilities.

Peter's blog is QuirksBlog, and is a great reference as well.  

Posted by larryb at 06:48 AM [permanent link]
Category: Web and Software Development

November 27, 2005

How to discourage innovation: measure everything

(title borrowed from Idea Festival) Of all the quotes that I came across in my exploration that started at Rod Boothby's Rigid Process can Kill Innovation post on his Innovation Creators blog,  my favorite is this: "process is an embedded reaction to prior stupidity."  (From Ross Mayfield's essay: The End of Process).  Mayfield says:

Because of constant change in our environment, processes are outdated immediately after they are designed. The 90s business process re-engineering model intended to introduce change, but was driven by experts who simply delivered another set of frozen processes.

The discussion is about innovation, and running an innovative organization.  Boothby addresses the balance between necessary process and empowering standards.

I think it is important to note that a structured environment for supporting innovation, with some process for sharing information and ideas, is fine - but those standards are standards of interaction - they are not standards of thought, and not standards for what innovative solutions are built.

He goes on to reference work by Harvard Business School's  Michael Tushman and  Wharton's Mary Benner  that show how process management programs discourage innovation.  

Process management can drag organizations down and dampen innovation. "In the appropriate setting, process management activities can help companies improve efficiency, but the risk is that you misapply these programs, in particular in areas where people are supposed to be innovative," notes Benner. "Brand new technologies to produce products that don't exist are difficult to measure. This kind of innovation may be crowded out when you focus too much on processes you can measure."

As someone who runs an innovative software development organization, I can attest to the challenge of maintaining balance.  You need enough process to keep the business running, but overall, the innovation comes from highly  talented, informed people working in a relatively process-free environment.  A former boss and mentor recently showed me the body of work her small, innovative team is doing at her new job.  The services and architecture being deployed online are dramatically impacting the entire business of a major institution with over 20,000 employees.  Her comment says it all:

The reason we can do this is because we minimize process.

The other lesson of that success is about loose coupling between enterprise applications.  I'll be talking about that at the Gilbane Conference on Content Management in Boston this week.  More to come on that...

November 16, 2005

Digital Asset Management - Some Advice

In preparation for my conference session today at the DAM (Digital Asset Management) Symposium, I was asked to summarize my thoughts on what's most important for people who are implementing a content management system to know.  

My personal experience in DAM is mostly centered around building systems for managing streaming media content (search, delivery management, metadata extraction, etc) and other kinds of multimedia materials.  Having done several generations of  a video content management system, as well as several multimedia authoring and asset management systems, here are a few points I think are important:
  1. Buy vs. Build --
    1. Realistically, it's not build vs. buy, it's "build" vs. "buy & build".  These implementations take a great deal of analysis of your business problem, copious customization, and require a strong internal team.  You cannot outsource success.
    2. Integration of a vendor solution can take as long as a custom build.  Be sure your vendor's direction and your implementation will let you take advantage of the vendor's upgrade path, otherwise you may have been better off building.
  2. Plan for change - Don't expect to get everything right in the specification stage.  When you define your business problem and its solution, find the right balance between up-front analysis/specification and leaving room for the system to evolve as its users begin using it.  Follow a path-based development model in which you break the big problem into a bunch of small ones and tackle each incrementally, because:
  3. "If a project team can eat more than two pizzas, it's too large."  This week's Baseline Magazine profiles Amazon.com CTO Verner Vogels and his approach to running Amazon's software development operation.  Small problems are easier to grasp, examine, and solve than big ones.  Small solutions are easier to explain, understand, test, and implement.  Small teams need less process, have few communications challenges, and lower overhead than larger ones.  Small teams can get real work done while large ones are still trying to find common understanding about the problem. 
  4. Retain internal development capacity -- in order to have the system evolve, you need to have internal expertise in modifying it.  
  5. Be ruthless about insisting on the use of open, flexible standards and APIs - Using a system based on open interfaces and standards gives you flexibility to create new things you didn't even dream of when you began.  Information "stovepipes" can be OK...as long as there are simple hooks between them.
More info can be found in the slides from my Gilbane Conference Presentation on content management systems for video and multimedia.

November 10, 2005

Amazon.com's Two-Pizza Team Rule

"If a project team can eat more than two pizzas, it's too large."  This week's Baseline Magazine profiles Amazon.com CTO Verner Vogels and his approach to running Amazon's software development operation.  

Vogels breaks big problems into smaller ones, then assigns tightly focused teams to nail one small problem at a time. As I pointed out in my own Gilbane Conference Keynote presentation  earlier this year, sometimes you really do have a large problem that needs a large team with an expansive view to solve it.  Most often, though, we complicate matters by tackling too big a chunk at once.  

Small teams and tight meetings [are] targeted to solve one or two problems, with challenges cut down to bite-size chunks. Where other retailers might ponder how to improve customer checkout, Amazon shaves layers off the concept and assigns them. One team might work on streamlining gift certificate redemption, another on credit card authorization. All projects take this approach at Amazon.

Vogels upholds the Amazonian principle of "two-pizza teams." That is, technology teams working on a given project typically can be fed by no more than two pizzas—usually eight or fewer people. Small teams are fast, he says, and don't get bogged down in so-called administrivia. 

Each group assigned to a particular business is completely responsible for it. Team members aren't considered database administrators or Java programmers or some other techie title. They're the people responsible for the customer checkout procedure or credit card verification process or search function.

The team scopes the fix, designs it, builds it, implements it and monitors its ongoing use. This way, technology programmers and architects get direct feedback from the business people who use their code or applications--in regular meetings and informal conversations.

There are two parts to this that I think are key.  They are a bit self-evident, but worthy of repeating.
  1. Small problems are easier to grasp, examine, and solve than big ones.  Small solutions are easier to explain, understand, test, and implement.
  2. Small teams need less process, have few communications challenges, and lower overhead than larger ones.  Small teams can get real work done while large ones are still trying to find common understanding about the problem.  
I'm not saying that no one should do large projects -- sometimes you simply must.  But take care to understand the difference between when you must, and when you really don't have to.
Posted by larryb at 05:56 AM [permanent link]
Category: Web and Software Development

November 09, 2005

JSON - Simple, Elegant Data Structures for Javascript/AJAX

I've used this technique recently to develop a project that uses client-side Javascript to handle all of its data and business logic.  I didn't know, until reading Anil Dash's excellent rundown of which "technologies and techniques are going to be popular in the coming months and into the next year", that this technique had a name.  JSON (JavaScript Object Notation) is an incredibly useful technique for building simple, powerful data structures into Javascript programs.  In this day of AJAX and other powerful ways to build outstanding interfaces and applications using client-side code, JSON is more useful than ever.

In a nutshell, you can define a data object explicitly like this:

var myData = {
    "title" : "In the Bubble: Designing in a Complex World",
    "author" : "John Thackara",
    "chapters" : [
         { "name" : "Lightness", "length" : "32", "finished" : true },
         { "name" : "Speed", "length" : "23", "finished" : true },
         { "name" : "Mobility", "length" : "41", "finished" : false }
         ]
    }

Then, you can access these in your Javascript code like you would an associative array:

myData.title evaluates to "In the Bubble: Designing in a Complex World"
myData.chapters[0].name evaluates to "Lightness"
or assign var chaps = myData.chapters; and then chaps[1].length gives "23"
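
And because it's ordinary Javascript data, you can compute over it directly:

// total the page counts of the finished chapters
var done = 0;
for (var i = 0; i < myData.chapters.length; i++) {
    if (myData.chapters[i].finished) {
        done += parseInt(myData.chapters[i].length, 10);
    }
}
// done is now 55 (32 + 23)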

If your data comes to you as a String, you can convert it to a JSON object  with 
var myObject = eval('(' + aJSONtext + ')');

Posted by larryb at 05:44 AM [permanent link]
Category: Web and Software Development

May 10, 2005

Big Mess on Campus: ERP vs. Loose Coupling
(with another AJAX example: Zuggest, an Amazon.com lookup)

CIO Magazine's Thomas Wailgum writes in Big Mess on Campus about how difficult it is to do ERP - especially in  educational institutions:

These recent campus meltdowns illustrate how the growing reliance on expensive ERP systems has created nightmare scenarios for some college CIOs. [ ... ]  They drool over the integrated views that an ERP system offers of finance, HR, student records, financial aid and more.

But those same officials often fail to see the enormous cultural and technical obstacles that can delay—and even cripple—such ambitious implementations. A recent survey found that university ERP implementations have taken far longer than expected and cost five times more than what the projected price tag was. "There are a lot of people who have scar tissue" from ERP failures, says Bob Weir, vice president of IS at Northeastern University—including himself.

The article goes on to detail ERP disasters at UMass, Stanford, and others.

Big comprehensive systems are hard to do and usually result in inflexible, set-in-stone-for-years systems.  The sweet spot is systems that couple loosely - either at the middleware layer (i.e., a custom J2EE web app fronting disparate backend systems that contain the data of record) or at the client layer (using open HTTP/Javascript/XML APIs).  How about this example, pulling Amazon results directly into the web page as you type... Zuggest.

Posted by larryb at 08:53 AM [permanent link]
Category: Web and Software Development

May 06, 2005

Spontaneous Integration - simple Library Lookup from Amazon.com

If you've been following Jon Udell's Spontaneous Integration columns, you know about the Library Lookup Project.  It provides a bookmarklet - a bookmark that runs a local script in your browser - that does a lookup at your local library of the book you're viewing on Amazon.com.  

It has to be customized to your library's query format.  While there's a long list of preconfigured scripts for hundreds of libraries on Jon's site, my own local library network was not among them.  So for anyone who needs it, here is the LibraryLookup bookmarklet for the SAILS Library Network, which covers 51 public libraries on Boston's South Shore - including Mansfield, Norton, Attleboro, New Bedford, Foxboro, Middleboro, Wareham, and others.

Just drag this link (SAILS Library Network) up to your Personal Toolbar.  Then when you're viewing a single book title on Amazon, click it to see the book's availability in the local library.
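
For the curious, the general shape of such a bookmarklet looks like the following (shown wrapped for readability - a real bookmarklet is one line - and with a placeholder catalog URL, since every library's query format differs):

javascript:var re=/([\/-]|isbn=)(\d{9}[\dX])/i;
if (re.test(location.href)) {
    // grab the 10-character ISBN out of the Amazon page URL
    var isbn = RegExp.$2;
    void(window.open('http://catalog.example.org/search?isbn=' + isbn,
        'LibraryLookup', 'width=575,height=500,scrollbars=1'));
}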

Posted by larryb at 08:58 AM [permanent link]
Category: Web and Software Development

April 15, 2005

Organic content management

To see the biggest benefits of Enterprise Content Management, don't try to do Enterprise Content Management.   While many vendors and speakers spoke of eliminating stovepipes of information within companies, in my Content Technology Works keynote at the Gilbane Conference on Content Management this week, I advocated something different.  

There are two approaches to ECM...one is the big-system-installation model where a large deployment serves all users across the enterprise.  This model addresses the needs of everyone with careful requirements collection and coordination amongst the many constituents of a content system -- across departments from marketing to engineering, from public relations to customer support.  Some businesses need this end-to-end approach -- the top-down, comprehensive structuring that can solve compliance-related ECM problems.

The other model is one that was suggested in a brief hallway conversation I had with Jon Udell on the topic: an organic, modular approach to managing content.  The idea is that building stovepipes is OK - provided they include connection points: technical interfaces based on open, available standards that allow innovation outside the tools themselves.  People with needs will come up with amazing, innovative ways to use, link, and engage your content if you leave them the freedom and "hooks" to do so.  Significantly, these smaller projects will also be easier, less costly, and less risky to develop.

Case in point: take two classic stovepipes of information on the public internet, Google Maps and Craigslist's real estate listings.  Paul Rademacher has cooked up something totally new from open interfaces to these two apps.  By scraping Craigslist for real estate rental and sales listings, then employing Google's Maps API, he's built a graphical interface to an area's available housing.  Browse the map, view photos, visit listings... it's an amazing implementation of something truly useful.  To restate the obvious: this incredibly useful integration was done by an internet user associated with neither Google nor Craigslist (echoed in Udell's software-as-a-service Infoworld article).

Now imagine this dynamic working within your company.  Sufficiently open, scriptable interfaces, combined with creative internal development capacity, equals innovation and utility beyond what you'd ever have designed from the top down.  

Posted by larryb at 05:41 PM [permanent link]
Category: Web and Software Development

April 03, 2005

The mis-uses of Flash

Flash is an amazing tool for creating stunningly effective user interfaces, particularly where rich media and video are involved.  But nothing is more frustrating than coming upon a site that ignores principles of good web design and web standards, often to the sole purpose of showing off the author's ability in Flash.  

These misguided Flash developers defeat themselves and their customers by flouting the very principles that make the Web work so well for so many.  I visited a site for a vendor that sells Flash extensions for Flash developers to include in their projects.  The site is done entirely in Flash - no HTML at all.  I encountered five immediate frustrations that led me not to buy, and to write this rant instead.
  • It makes me wait and watch while it progressively draws itself.  Very artsy, but anti-user.  Why does the author think I want to wait for him to load the page, unnecessarily slowly?  
  • I can't use Firefox control-click to open links in a new tab.  I like tabs, I use tabs, and this author decided that when I view his site, I can't use tabs.  Bye-bye.
  • I can't use browser controls to increase font size.
  • I cannot use standard keyboard shortcuts to select links or to scroll the page.  
  • Back button rendered useless -- now I have to rely entirely on the site author's idea of how I should navigate the site.  
As a final self-defeating point, this site is completely invisible to the search engines.  I can't find it in Google; even when I type in the exact name of the product, the search comes up empty.  Flash is opaque to crawlers!

I've become a convert to Flash for rich media, but for standard web design, don't try to reinvent the function of the browser in Flash.  You can't improve the usability, only narrow your users' choices.  According to DoubleClick, over the last year, online users are spending 10% less time on each site, yet looking at 34% more pages of the site during each visit.  What this means to site designers is, don't look for ways to piss off your visitors.  They'll take their browsing, and their business, elsewhere.

Posted by larryb at 11:07 PM [permanent link]
Category: Web and Software Development

January 08, 2005

Automated Batch Exporting of Flash .fla files


Macromedia Flash is fast becoming a major platform for web development.  My own group is developing not just content, but tools and even authoring environments using Flash.  But one area where Flash lags far behind is in development process management.  

In particular, my group recently had to re-Export about 600 Flash .fla files while rebuilding a large project.  Unfortunately, Macromedia provides no command-line interface, no batch processing mode that will allow automated, managed build processes such as Apache's ant, which is a staple in most standard software development groups.

A search on the Web and among colleagues found one promising tool - FlashCommand.  This command-line interface to Flash seemed to offer a way to Export Flash via a command script.  Unfortunately, in practice, this didn't work - it launches the Flash GUI, where the export operation stalls on a modal dialog box waiting for input that will never come.

My solution - it turned out to be fragile but workable for a one-off kind of need.  AutoIt V3 is a scripting tool that allows scripting the Windows GUI.  It depends a lot on window titles and on timing - even the speed of your system can affect the script you'll write for certain operations - but it worked and turned the export of 600+ .fla files into a short, unattended operation.  The script that worked is in the extended part of this posting (click the continue link below), for those who could benefit from it.

This is no substitute for a real command-line mode and real 'ant' integration for managing builds, as is so essential in busy development shops doing complex projects. Macromedia needs to step up to the plate and make professional-grade development tools available if it wants to continue to drive towards playing in the "big leagues".

Here's the AutoIt script code. It assumes that all your .fla files are in c:\tmp, and that you're using Flash MX 2004 Professional. You can easily edit the script for different versions or file locations. Just save this text in a file named [something].au3, and then call that file from the Windows command-line.
;code begins here
; Export every .fla file in the current directory to .swf
$search = FileFindFirstFile("*.fla")

; Check if the search was successful
If $search = -1 Then
    MsgBox(0, "Error", "No files/directories matched the search pattern")
    Exit
EndIf

While 1
    $file = FileFindNextFile($search)
    If @error Then ExitLoop
  ;c:\tmp in the following line can be edited to your choice of file location
  Run("C:\Program Files\Macromedia\Flash MX 2004\flash.exe c:\tmp\"& $file)
  WinWaitActive("Macromedia Flash MX")
 ; send export command
 Send("^!S")
 WinWaitActive("Export Movie")
; set new filepath and name
 $file = StringTrimRight($file, 4)
 Send("C:\tmp\" & $file & ".swf")
 Send("{ENTER}")
 WinWaitActive("Export Flash Player")
; dismiss the export settings dialog
 Send("{ENTER}")
; if you're not using MX 2004 Pro, edit the following line to match the Window title of your version
 WinWaitActive("Macromedia Flash MX Professoinal - [" & $file)
; close .fla file
 Send("^w")
 Send("n")
WEnd

; Close the search handle
FileClose($search)

December 22, 2004

Javascript to stop animated GIFs?

I'm working on a complete rebuild of the HBS Videotools online video content management system. We're using keyframes pulled from Virage logging of the video to build an animated GIF that cycles through a few video frames -- this appears in search results and on video details pages. The problem is that if you have a lot of these in your results, the page can be a bit "overactive" to view comfortably.

So....what I'm looking for is a way to let the user stop the animations at will. Pressing the Escape key does it. Programmatically I can do it with Javascript by issuing a window.stop() command, but that only works on Netscape/Mozilla/Firefox based browsers. Is there a way to accomplish the same thing for IE users? Email me if you have a solution - I'll post again if I find anything that works.
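
For what it's worth, here's the shape of the combined approach I'm imagining.  The IE call (document.execCommand('Stop')) is one I've seen suggested as the equivalent, not one I've verified - so test it:

// let the user stop the GIF animations on demand
function stopAnimations() {
    if (window.stop) {
        window.stop();                  // Netscape/Mozilla/Firefox
    } else if (document.execCommand) {
        document.execCommand('Stop');   // reportedly the IE equivalent
    }
}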

Posted by larryb at 09:37 AM [permanent link]
Category: Web and Software Development

March 25, 2004

Mozilla ActiveX causes problem for Flash - but here's the fix

The current version of the Mozilla Windows Media ActiveX control has a quirk in its default configuration.  If you've installed it, you may find that links in Flash animations don't work anymore. 

The ActiveX control plugin lets Netscape 7.1+ or Mozilla embed the Windows Media Player 7 or 9 ActiveX control using the OBJECT tag.  (see The Windows Media ActiveX Control – Not Just for Internet Explorer Anymore)  Previously, the Windows Media 6.4 plugin was the only way to use WM in browsers other than Internet Explorer.  The problem is that with ActiveX support installed, Mozilla loads the Flash ActiveX control rather than the Flash plugin.  There's a bug in the Flash ActiveX<->Mozilla connection that makes links from Flash non-functional. 

The solution: configure Mozilla's ActiveX support to work only with Windows Media - its intended purpose.  To do this:
  1. Find the activex.js file in C:\Program Files\mozilla.org\Mozilla1.6\defaults\pref
  2. Change the word true to false in the following line:
    pref("security.classID.allowByDefault", false); 
    This pref sets the default policy to either allow all controls or deny them all by default.
  3. add the line
    pref("capability.policy.default.ClassID.6BF52A52-394A-11d3-B153-00C04F79FAA6", "AllAccess");
Now save that file and you're done.  Windows Media 9 will still work (callbacks and all), and Flash will work as it should via the (fully-functional) plugin.  For Netscape 7.1 users, this configuration is the default, so you won't have to do anything.

March 24, 2004

Every Customer Counts

In a recent streamingmedia.com article about using the Windows Media ActiveX control in Mozilla, I said:

Powerful new browsers like Apple's Safari and the Gecko-based Netscape/Mozilla family are enticing users with their seductive features and blazing performance. Thankfully, the widespread adoption of standards by all the browsers has made it a relatively simple matter to write Web sites that work seamlessly on any browser and operating system.

Typically only novice developers create IE-only Web pages. Coding to standards is just plain smart, as well as being good for our industry. Don't forget - our industry literally exists because of the open standards upon which the Web was built.

It's nice to see the sentiment echoed, more eloquently, by Jim Rapoza in eWeek: "Every Customer Counts"

Also guilty are the many Web sites that either work best—or only—with Microsoft's Internet Explorer browser.

This is laziness, pure and simple. There is no feature developed for IE that cannot also be developed using open standards. And for Web site developers and operators, open standards provide a whole host of integration benefits beyond customer inclusion—and at no additional cost.

He's right - if a product or service I'm evaluating doesn't function cross-platform and cross-browser, that's a show-stopper.  Being locked in benefits the vendor, but does nothing for me except restrict my choice of tools to run my business.  To give up my choices for strategic reasons is one thing; to give them up because the vendor is either ignorant or lazy is irresponsible.

Why use anything other than IE?  Here are 15 reasons why Mozilla is more productive and more fun (and narrowing it to 15 was hard!).

Posted by larryb at 07:21 AM [permanent link]
Category: Web and Software Development

March 08, 2004

Google's Secrets of Fostering Innovation in Technology Development


"An IT organization running at full throttle all the time is ultimately self-defeating"

Chad Dickerson, in last week's Infoworld column, made the above observation while recounting the five basic principles that drive Google's technology development:
  • Work on things that matter
  • Affect everyone in the world
  • Solve problems with algorithms where possible (automate everything)
  • Hire bright people and give them lots of freedom
  • Don't be afraid to try new things
In particular, Chad notes that:

As a general practice, Google also requires that its engineers spend 20 percent of their time working on personal technology projects unrelated to their primary projects.  ...  I think hiring bright people and giving them freedom is a required element of an innovative organization, one that implicitly supports trying new things.

I've seen some of these principles at work at Harvard Business School since its transformational IT Initiative in 1996.  It's not always easy to stay on the path - balancing innovative freedom with operational rigor - but it pays big dividends to try.

Posted by larryb at 06:55 AM [permanent link]
Category: Web and Software Development