Technical Challenges
The Ajax model of Web development can provide a much richer and more seamless user experience
than the traditional full-page refresh, but it poses some difficulties as well. At its core,
Ajax entails using the browser's client-side scripting capabilities in conjunction with its DOM to
achieve a method of content delivery that was not entirely envisioned by the browser's designers.
(Perhaps this hack-like nature contributes to Ajax's appeal.) However, it also subjects Ajax-based
applications to the same browser compatibility issues that have plagued Web designers ever since
Microsoft created Internet Explorer. For example, Ajax engines make use of an XMLHttpRequest
object to exchange data asynchronously with remote servers. In Internet Explorer 6, this object is
implemented with ActiveX rather than native JavaScript, which requires that ActiveX be enabled.
A more fundamental requirement is that JavaScript be enabled within the user's browser. This is
a reasonable assumption for the majority of users, but some use browsers or automated tools that
either do not support JavaScript or do not have it enabled. One such set of tools is the robots,
spiders, and Web crawlers that aggregate information for Internet and intranet search engines.
Without graceful degradation, Ajax-based mashup applications might find themselves missing out
on both a minority user base and
search engine visibility.
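For illustration, the following is a minimal sketch of how an Ajax engine might obtain a request
object across browsers, falling back to ActiveX on Internet Explorer 6 and degrading gracefully when
neither is available. The function name createRequestObject is a hypothetical example, not part of
any particular library:

    function createRequestObject() {
        if (window.XMLHttpRequest) {
            return new XMLHttpRequest();                        // native implementation
        }
        if (window.ActiveXObject) {
            try {
                return new ActiveXObject("Msxml2.XMLHTTP");     // Internet Explorer 6 (newer MSXML)
            } catch (e) {
                return new ActiveXObject("Microsoft.XMLHTTP");  // older MSXML fallback
            }
        }
        return null;  // no Ajax support: let ordinary links and full-page refreshes do the work
    }

When createRequestObject() returns null (or when scripting is disabled entirely, as with many
crawlers), the page should still expose its content through conventional links so that neither users
nor search engines are locked out.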
The use of JavaScript to asynchronously update content within the page can also create user
interface issues. Because content is no longer necessarily linked to the URL in the browser’s
address bar, users might not experience the functionality that they normally expect when they use
the browser's BACK button or the BOOKMARK feature. And, although Ajax can reduce latency by
requesting incremental content updates, poor designs can actually hinder the user experience,
such as when updates are so fine-grained that their quantity and overhead
saturate the available resources. Also, take care to support the user (for example, with visual
feedback such as progress bars) while the interface loads or content is updated.
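One widely used mitigation for the BACK-button and bookmarking problems is to record application
state in the URL fragment, so that the browser's history and bookmarks still reflect what the user
sees. The sketch below assumes a hypothetical showStory() function that fetches and renders one
piece of content via Ajax:

    function openStory(storyId) {
        // Recording the view state in the fragment creates a history entry
        // that the BACK button and bookmarks can return to.
        window.location.hash = "story=" + storyId;
        showStory(storyId);
    }

    // Restore state when the fragment changes (BACK/FORWARD or a bookmarked URL).
    // Browsers without the hashchange event need to poll location.hash on a timer instead.
    window.onhashchange = function () {
        var match = window.location.hash.match(/story=(\w+)/);
        if (match) {
            showStory(match[1]);
        }
    };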
As with any distributed, cross-domain application, mashup developers and content providers alike
will also need to address security concerns. The notion of identity can prove to be a sticky subject,
as the traditional Web is primarily built for anonymous access. Single sign-on is a desirable feature,
but the multitude of competing technologies (ranging from Microsoft Passport to the Liberty
Alliance) creates disjointed identity namespaces that you must integrate as well. Content
providers are likely to employ authentication and authorization schemes (which require the notion
of secure identity or securely identifiable attributes) in their APIs to enforce business models that
involve paid subscriptions or sensitive data. Sensitive data is also likely to require confidentiality
(that is, encryption), and you must take care when you mash it with other sources to not put it at
risk. Identity will also be crucial for auditing and regulatory compliance. Additionally, with data
integration happening on both the server and the client side, identity and credential delegation from
the user to the mashup service might become a requirement.
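As a rough illustration, a mashup client calling a protected provider API over HTTPS might attach
credentials along the following lines. The endpoint, the apikey parameter, the bearer token, and the
mergeIntoMashup() function are all hypothetical; every provider defines its own scheme:

    var xhr = new XMLHttpRequest();
    // HTTPS keeps the credentials and any sensitive payload confidential in transit.
    xhr.open("GET", "https://api.example.com/feeds?apikey=YOUR_KEY", true);
    // Some providers expect a token in a header rather than a query parameter; with
    // credential delegation, this token would be obtained on the user's behalf.
    xhr.setRequestHeader("Authorization", "Bearer USER_DELEGATED_TOKEN");
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            mergeIntoMashup(xhr.responseText);  // combine the response with other sources
        }
    };
    xhr.send(null);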
Social Challenges
In addition to the technical challenges described in the previous section, social issues have surfaced
(or will surface) as mashups become more popular.