Monday, June 28, 2004

Browser NG: Some weird thoughts

Hmm.. so I finally got around to using Firefox 0.9 last week. The push came from some issues I encountered with Internet Explorer (see below for more information) and the question of whether it was a browser issue. Anyway, I have shifted completely to this new browser, but so far I have not been able to find the wow factor in this tool. As a self-proclaimed geek I do like some things, like the DOM Inspector and the JavaScript Console (even though a small icon at the bottom indicating that there was an error would make more sense), but I see these features as just an extension of the basic browser facility of rendering the content generated by the content provider. Isn't it time that we gave control over content rendering to the client/user? This control can start with simple things like:
  • Setting page properties: Allow the client to set an automatic refresh on a page that does not have a REFRESH tag embedded in it (will this be construed as a denial-of-service attack?).
  • Web site macro recording and playback: Record the sequence of events in the browser for a user and then automate that sequence when the user accesses the web site (obviously, in case of any problem, the user should be able to take over). This ensures that you go directly to the page you want and are not bothered by things that are of no use to you.
  • Web site search: Now that you have reached a site, would it not make more sense for the browser to automatically search the site for all the relevant pages, tell you when you hover over a link whether it is the best bet, or even skip pages to go straight to the page that best matches the link? (Obviously, having such search capability at your fingertips would require really cheap, high-speed network access, or the search engines could provide such capabilities, which may already be there.) And maybe even have a sidebar with a list of sites that closely match the content of the site you are on (such a service would come from a search engine like Google or Yahoo or the next big search engine).
  • Web services support: With the growing acceptance of web services as one of the interfaces for providing a service, it would be interesting if browsers provided the capability of rendering a web service interface as a user interface, which could be used to invoke the web service; the response could then be rendered by either client-side XSL or service-provider-side XSL (see the sketch below).
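To make the last idea a little more concrete, here is a minimal sketch of the first step a browser would need: fetch a service's WSDL and turn its operations into crude "forms". The service URL is hypothetical, and this is Python purely for illustration; a real implementation would also walk the type definitions to build input fields and actually invoke the operation.

```python
# Minimal sketch (hypothetical URL): fetch a WSDL document and list its
# operations as a crude stand-in for "rendering the interface as a UI".
import urllib.request
import xml.etree.ElementTree as ET

WSDL_URL = "http://example.com/HRService?wsdl"  # hypothetical service
WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

with urllib.request.urlopen(WSDL_URL) as resp:
    tree = ET.parse(resp)

# Each portType operation becomes one "form" the browser could render.
for port_type in tree.iter(f"{{{WSDL_NS}}}portType"):
    for op in port_type.findall(f"{{{WSDL_NS}}}operation"):
        print(f"[Form] {op.get('name')}")
        for child in op:
            tag = child.tag.split("}")[-1]
            # input/output/fault children name the messages exchanged
            if tag in ("input", "output", "fault"):
                print(f"    {tag}: {child.get('message')}")
```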

Well, in case we are comfortable with this level of customization, I think we have the existing technology to build client-side portals. I remember that around '99-'00 Microsoft had this idea of Active Desktop, which was sold as some sort of push-or-pull model system. The idea here is similar but goes further and gives complete control to customers. Customers would be able to create their own intelligent browser skins (for text data, the equivalent of CSS), customizable to render information based on the space allocated to the site in the window. The skin would also track the user's browsing habits (7/4/04: something similar), a privacy concern in case the machine is compromised, and customize itself to make better decisions about what the user is trying to do on the web (well, it may start with some fairly stupid intelligence built into the product, and then we can go from there with continuous feedback from the users).

I know I sound very naive raising it, but it is time to make protocols like RSS standard on the web for content publishing, in conjunction with HTML. This basically converts the entire web into one big blogosphere, and RSS readers can be the next-generation browsers, with improved skins to customize the client-side portal with all the information that is needed. But is looking at the entire internet as a blogosphere the right way to interpret it? Maybe it is a programmer's view of the world (remember the last time your manager was happy with the website immediately after the web designer had updated the look and feel, right after you had completed the functionality, as if functionality was nothing in comparison to look and feel): being able to access the content without getting bogged down by the look-and-feel components that reduce the content area. But I have a feeling that the concept of a skin, if implemented well, will be accepted by the standard user who wants to give a personal touch to the whole browsing experience, similar to, for example, choosing the colors to paint their house.

But would that mean there would be no free sites (because free content providers try to entice you to stay on the website longer by developing the look and feel to achieve that, thus giving themselves more opportunity to generate revenue through you)? Maybe that would be the case, and content preview sites would come into being, which would aggregate content from a variety of providers (and would be subscribed to by clients to decide what they want), allow you to decide which content you want to subscribe to, and then let you use micro-payments to pay the content provider on a per-use basis. At the same time, just like modern bloggers and other people looking for non-tangible assets like fame and popularity, or acting for altruistic reasons, some will continue to provide content for free. But in the absence of content-provider-funded art, how will the artist/modern web designer survive? I do not know; maybe by directly selling their services and ideas to people who want to customize their portal to their liking, in a way similar to how interior decorators sell their services to clients. This whole thing seems too far-fetched to be happening in the next few years, but my feeling is that the user has to take back control over the content and should be able to get the stuff that he is looking for.
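For the RSS angle, here is a minimal sketch of what "RSS reader as next-generation browser" boils down to at its simplest: pull a few feeds and lay the items out as one client-side portal view, with the client (not the publisher) deciding the presentation. The feed URLs are hypothetical, and this is Python purely for illustration; the "skin" idea would sit on top of exactly this kind of aggregation.

```python
# Minimal sketch (hypothetical feed URLs): aggregate a few RSS 2.0 feeds into
# one client-side "portal" view; the client, not the publisher, decides layout.
import urllib.request
import xml.etree.ElementTree as ET

FEEDS = [  # hypothetical subscriptions chosen by the user
    "http://example.com/news/rss.xml",
    "http://example.org/blog/rss.xml",
]

for feed_url in FEEDS:
    with urllib.request.urlopen(feed_url) as resp:
        channel = ET.parse(resp).find("channel")
    print("==", channel.findtext("title", default=feed_url), "==")
    for item in channel.findall("item")[:5]:  # first five items per feed
        print(" -", item.findtext("title", default="(untitled)"),
              "|", item.findtext("link", default=""))
```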

IE Problem

Now, the problem with IE that I noticed in versions 5.5 and 6.0 SP1 is as follows.

  1. Create a website on a server that supports Integrated Windows Authentication using Kerberos (IIS in a W2K domain will be the simplest).
  2. On this site, protect one of the pages with SPNEGO while leaving another page open to anonymous access (this is to simulate different applications with different authentication requirements).
  3. Now, if you try to access the URL protected by IWA from a machine where you have logged in locally (i.e., not into the domain), you should get a 401.1 or 401.3 error.
  4. Now, to increase the fun (or what my boss calls increasing the usability of the site), replace these default error pages with your own error page, which redirects the person to a page not protected by IWA. This page should post data to itself and show the posted data, to simulate a password-based login process (a simple ASP page should do the trick; see the sketch after these steps).
  5. Go ahead and try to access the protected page and then submit the data to IIS.
  6. If you see that no data was POSTed, you have successfully simulated the problem.
  7. Now change the redirect URL in the error page to point to a directory instead of the ASP file, and make the ASP file the default page for that directory. Try the protected URL again and post the data.
  8. Voila!! You should now start seeing the posted data.
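Step 4 calls for a "simple ASP" page on IIS; purely to show what that echo page needs to do, here is the same behaviour sketched as a tiny Python handler (a hypothetical stand-in, not what actually runs on IIS): serve a form that posts to itself and echo back whatever was posted, so a missing POST body is immediately visible.

```python
# Stand-in sketch (not ASP/IIS): a page that POSTs to itself and echoes the
# posted data, so you can see at a glance whether the body arrived.
from http.server import BaseHTTPRequestHandler, HTTPServer

FORM = (b"<form method='post' action=''>"
        b"<input name='user'><input name='pwd' type='password'>"
        b"<input type='submit'></form>")

class EchoPage(BaseHTTPRequestHandler):
    def do_GET(self):
        self._reply(FORM)

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # If the browser dropped the POST body after the redirect, this is empty.
        self._reply(b"Posted data: " + body + FORM)

    def _reply(self, payload: bytes):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), EchoPage).serve_forever()
```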

I could not simulate the problem in Firefox 0.9 (it completely ignored the SPNEGO) or Opera 7.0 (it just gave an error), since they do not seem to support SPNEGO, or is it NTLM (I can't be sure, because I was not logged in to the domain).

Update 7/13: I have been getting more hits on support for SPNEGO in Firefox and thought I would add the information for the people who come to this site looking for it. Integrated Windows Authentication can be performed using NTLM authentication or Kerberos ticket-based authentication (Win2K and above). Firefox supports NTLM, so if you enable Integrated Windows Authentication on IIS (or the corresponding authentication scheme on any other server), the server will send Negotiate and NTLM as the supported authentication mechanisms. On XP (the platform I tested on) you will get a prompt for an ID and password, which Firefox will then use to perform NTLM authentication. You cannot disable the prompt for user ID and password (not that I know of). At the moment there does not seem to be support for the Negotiate authentication scheme (the Kerberos ticket-based authentication that lets Internet Explorer provide a more user-friendly, reduced sign-on solution). Another thing to keep in mind: if a server other than IIS is used, make sure that the server sends the NTLM header along with the Negotiate header. If it sends only the Negotiate header (some SSO products send only the Negotiate header), Firefox will not switch to NTLM for authentication and will just display the page returned by the server. Hope this helps!!
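A quick way to check which schemes your server actually offers (the point above about sending NTLM alongside Negotiate) is to look at the WWW-Authenticate headers on the 401 challenge. A minimal sketch in Python, with a hypothetical URL:

```python
# Minimal sketch (hypothetical URL): inspect which authentication schemes a
# server offers, to see whether it advertises both "Negotiate" and "NTLM".
import urllib.error
import urllib.request

URL = "http://intranet.example.com/protected/"  # hypothetical protected page

try:
    urllib.request.urlopen(URL)
    print("Server did not challenge for authentication.")
except urllib.error.HTTPError as err:
    if err.code == 401:
        # A server may send several WWW-Authenticate headers, one per scheme.
        schemes = err.headers.get_all("WWW-Authenticate") or []
        print("Offered schemes:", schemes)
        if (any(s.startswith("Negotiate") for s in schemes)
                and not any(s.startswith("NTLM") for s in schemes)):
            print("Only Negotiate offered: Firefox 0.9 will not fall back to NTLM.")
    else:
        print("Unexpected HTTP error:", err.code)
```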

Tuesday, June 01, 2004

Standardization of XML or XMLization of standards and Services

Well, the title kind of gives the basic idea behind these thoughts, i.e., what is the flood of XML-based standards trying to achieve? The way I initially understood the whole set of standards is:
  • XML is good (mkay),
  • it is required to interoperate (mkay),
  • previous attempts at making computers talk to each other were bad (mkay),
  • Web Services are good and we can talk to Microsoft (mkay).
But looking at it that way does not make a lot of sense to me.

XML is the same

XML is NOT the solution to the interoperability issue. Period!!! As far as I understand it, the idea of interoperability is to make two machines talk the same language, and it has nothing to do with making the protocol readable to humans. The last time I checked, my computers were still talking to each other under the hood using a weird language called binary. So what does XML have to do with the interoperability issue, and what was missing from DCE and CORBA that did not allow them to solve it? Let us look at the whole idea of making computers talk to each other. Most people agree that computers talk to each other using protocols, which are basically languages with their own word sets. Now, to allow two applications running on different machines to talk to each other, the following approaches have been tried.
  • Same code on both machines: which is what DCE tried to achieve. It had its own issues, such as not being available out of the box on mainframes (I remember hearing about a DCE version on OS/390 from IBM that was very basic) or on Windows. Besides that, it was realized that porting the code to different environments itself resulted in a set of interoperability issues.
  • May or may not be the same code, but a very well-defined protocol: something that was tried with CORBA, using binary tags and a bunch of standards developed around it. Well, implementation issues and the associated interoperability issues made sure the implementations from different vendors could not talk to each other.
So it seems that having standards does not solve interoperability, at least in the software world. Why? One reason could be that open standards are too open, leaving a lot of leeway to software implementations, which results in interoperability issues (something that we can see with most of the Web Service standards). Another issue that seems to be important is that standards follow products and not vice versa. This means that products come out with new features, and then standards try to create specifications for the similar features provided by competing vendors, resulting in a standard that arrives too late to give vendors any incentive to interoperate. Hence any thought of establishing standards for low-level product interoperability (like RPC or message transport protocols) is meaningless. Then what is it that we are looking for in terms of standards that will help us achieve interoperability? We should be looking at Business Service standards. That is what we should try to achieve via Service Oriented Architecture; otherwise the whole SOA effort is of no use and will only result in the development of yet another interface in front of the business logic, alongside mainframe console applications, DCE, CORBA, and HTML.

XML is different

Then why should we bother with XML and the whole Web Services stack, with WS-* and a bunch of OASIS standards? Well, looking at this pool of standards, we have to realize that these are the foundation blocks on which we will be building the Business Service standards; they are just a means to that end. In that sense, we could have used any other low-level open standard (like DCE, CORBA, or DCOM if it were open). So how is XML different? One thing we have to realize is that even though interoperability is achieved between machines, the standards are developed by humans. So if you want the business groups of companies to get together and chalk out a Business Service standard at a global level, allowing them to define these protocols in a human language (like English) would be more appropriate than binary tags. This means that the HR managers can get together, define which business services they require to be automated, and define the format of the data that should be sent around as protocols and standards. These standards can then be implemented by any product that must provide the corresponding HR services.

So now you do not have to develop the interfaces in-house using Java or .NET; instead, you can buy the business service interface as a commodity, which is basically the code that provides the interface. This code comes with application servers, or as libraries from software companies focused on vertical segments, or some combination thereof. The developer would only be required to write the pure business logic behind the service (which most of the time would already be present). Even though it would be better to have these standards define the aspects of the service (like the transport layer, authentication level, encryption level, and so on), I think we may have to evaluate that during implementation and require the developers to configure these aspects of the protocol. Even though you can see some such service standards taking shape at OASIS, I am not sure how much impetus we have from vendors. Most of the time I see them fighting over standards for low-level services and not providing enough help to business owners to get to the next step of standards, i.e., business standards. Last week it seemed like BEA started something on that front with the TMC SOA blueprint, but I am not sure whether it will go any further until it gets backing from the organizations devoted to the business side (like accounting or HR).
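To illustrate what a message in such a business-level standard might look like, here is a minimal sketch of a hypothetical HR "new hire" request. The element names (NewHireRequest, EmployeeName, StartDate) are made up for illustration; no real OASIS schema is implied, and Python is used only to show that the document, not the code, is the contract.

```python
# A minimal sketch of a message in a hypothetical "HR Business Service"
# standard. All element names are invented for illustration only.
import xml.etree.ElementTree as ET

def build_new_hire_request(name: str, start_date: str, department: str) -> bytes:
    root = ET.Element("NewHireRequest", attrib={"version": "0.1"})
    ET.SubElement(root, "EmployeeName").text = name
    ET.SubElement(root, "StartDate").text = start_date
    ET.SubElement(root, "Department").text = department
    return ET.tostring(root, encoding="utf-8")

if __name__ == "__main__":
    # Any HR product claiming to implement the (hypothetical) standard would
    # accept this document, regardless of the language it was produced with.
    print(build_new_hire_request("Jane Doe", "2004-07-01", "Engineering").decode())
```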

Road Ahead

Now, given the time it takes for the basic protocols themselves to become standards, any thought of having service standards seems far away in the future. I get a feeling that we will lose this battle of standardization again. There is already talk about how XML eats into CPU cycles and network bandwidth during (de)serialization and transport, and how we should go back to the good old days of binary protocols. These people have to understand that XML, though not the best way for machines to talk to each other, may be the best chance to make humans agree on standards. We should also begin working on another set of standards for converting XML standards into a protocol that suits computers more than humans, i.e., the binarization of XML standards. This may be our only chance to get out of the endless technological cycles of standardization and take standards to a logical end.

PS: Hmm... a prophetic doomsday ending makes it quite entertaining, I hope, even though it does not make any sense.