Monday, June 28, 2004

Browser NG: Some weird thoughts

Hmm.. so I finally got around to using Firefox 0.9 last week. The push came from some issues I ran into with Internet Explorer (see below for more information) and wanting to know whether it was a browser issue. Anyway, I have shifted completely to this new browser, but so far I have not been able to find the wow factor. As a self-proclaimed geek I do like some things, like the DOM Inspector and the JavaScript Console (even though a small icon at the bottom indicating that there was an error would make more sense), but I see these features as just extensions of the basic browser job of rendering the content generated by the content provider. Isn't it time we gave control over content rendering to the client/user? This control can start with simple things like
  • Setting Page Properties Allowing the client to set an automatic refresh on a page which does not have a REFRESH tag embedded in it (will this be construed as a denial-of-service attack?)
  • Web Site Macro Recording and Playback Record a user's sequence of events in the browser and then automate that sequence when the user accesses the web site (obviously, in case of any problem, the user should be able to take over). This ensures that you go directly to the page you want and are not bothered by things that are of no use to you.
  • Web Site Search Now that you have reached a site, would it not make more sense for the browser to automatically search the site for all the relevant pages, and when you hover over a link tell you whether it is the best bet, or even skip ahead to the page that best matches the link? (Obviously, having such search capability at your fingertips would require really cheap and high-speed network access, or the search engines could provide such capabilities, which may already be there.) It could even have a sidebar listing sites that closely match the content of the site you are on (such a service would come from a search engine like Google or Yahoo or the next big search engine).
  • Web Services Support With the growing acceptance of web services as an interface for providing services, it would be interesting if browsers could render a web service interface as a user interface that can be used to invoke the service, with the response then rendered by either client-side or provider-side XSL.
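To make that last idea concrete, here is a rough sketch (in Python, with a made-up operation description instead of real WSDL) of how a browser might turn a web-service operation into a form it can render; the service URL, operation name, and parameter names are all invented for illustration:

```python
# Sketch: turn a (simplified, hypothetical) web-service operation
# description into an HTML form the browser could render. A real
# implementation would parse WSDL and render the response with XSL;
# here the operation is just a name plus a list of parameters.

def operation_to_form(service_url, operation, params):
    """Render one web-service operation as an HTML form."""
    fields = "\n".join(
        f'  <label>{name}: <input name="{name}" type="text"></label>'
        for name in params
    )
    return (
        f'<form action="{service_url}" method="post">\n'
        f'  <input type="hidden" name="operation" value="{operation}">\n'
        f"{fields}\n"
        '  <input type="submit" value="Invoke">\n'
        "</form>"
    )

html = operation_to_form("http://example.com/hr", "getEmployee", ["employeeId"])
print(html)
```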

Well, if we are comfortable with this level of customization, I think we have the existing technology to build client-side portals. I remember that around '99-'00 Microsoft had this idea of Active Desktop, which was sold as some sort of push/pull model system. This idea is similar but goes further and gives complete control to the customer. Customers would be able to create their own intelligent browser skins (for text data, the equivalent of CSS) that customize how information is rendered based on the space allocated to the site in the window, track the user's browsing habits (7/4/04: something similar), which is a privacy concern in case the machine is compromised, and adapt themselves to make better decisions about what the user is trying to do on the web (it may start by building rudimentary intelligence into the product, and we can go from there with continuous feedback from users).

I know I sound very naive to raise it, but it is time to make protocols like RSS standard on the web for content publishing in conjunction with HTML. This basically converts the entire web into one big blogosphere, and RSS readers can be the next-generation browsers, with improved skins to customize the client-side portal with all the information that is needed. But is looking at the entire internet as a blogosphere the right way to interpret it? Maybe it is a programmer's view of the world (remember the last time your manager was happy with the website immediately after the web designer updated the look and feel, right after you had completed the functionality, as if functionality was nothing in comparison to look and feel): being able to access the content without getting bogged down by look-and-feel components that reduce the content area.
But I have a feeling that the concept of skins, if implemented well, will be accepted by the standard user who wants to give a personal touch to the whole browsing experience, similar to choosing the colors to paint their house. But would that mean there would be no free sites (since free content providers try to entice you to stay on the website longer by developing the look and feel to achieve that, giving them more opportunity to generate revenue through you)? Maybe that would be the case, and content preview sites would come into being which aggregate content from a variety of providers (and would be subscribed to by clients to decide what they want), allowing you to decide which content to subscribe to; you could then use micro-payments to pay the content provider on a per-use basis. At the same time, just like modern bloggers and other people looking for non-tangible assets like fame and popularity, or acting for altruistic reasons, some will continue to provide content for free. But in the absence of content-provider-funded art, how will the artist/modern web designer survive? I do not know; maybe by directly selling their services and ideas to people who want to customize their portal to their liking, much the way interior decorators sell their services to clients. This whole thing seems too far-fetched to happen in the next few years, but my feeling is that the user has to take back control over the content and should be able to get the stuff that he is looking for.

IE Problem

Now the problem with IE, which I noticed in versions 5.5 and 6.0 SP1, is as follows.

  1. Create a website on a server that supports Integrated Windows Authentication using Kerberos (IIS in a W2K domain will be the simplest).
  2. On this site, protect one of the pages with SPNEGO while leaving another page open to anonymous access (this is to simulate different applications with different authentication requirements).
  3. Now if you try to access the URL protected by IWA from a machine where you are logged in locally (i.e. not into the domain), you should get a 401.1 or 401.3 error.
  4. Now, to increase the fun (or what my boss calls increasing the usability of the site), replace these default error pages with your own error page that redirects the person to a page not protected by IWA. This page should post data to itself and show the posted data, to simulate a password-based login process (a simple ASP should do the trick).
  5. Go ahead and try to access the protected page and then submit the data to IIS.
  6. If you see that no data was POSTed, you have successfully simulated the problem.
  7. Now change the redirect URL in the error page to a directory instead of the ASP file, and make the ASP file the default page. Try the protected URL again and post the data.
  8. Voila!! You should now start seeing the posted data.
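For reference, the self-posting page from step 4 boils down to the following logic. I have sketched it in Python rather than ASP, and the field names are just placeholders:

```python
# Sketch of the step-4 error page logic (in Python instead of ASP): a
# page that renders a login form posting to itself, and echoes back
# whatever was POSTed so you can see whether the browser actually sent
# the form data.

def handle(method, posted_fields):
    """Return the page body for a GET (show form) or POST (echo data)."""
    if method == "POST" and posted_fields:
        rows = "".join(f"<li>{k} = {v}</li>" for k, v in posted_fields.items())
        return f"<p>Posted data:</p><ul>{rows}</ul>"
    # No data arrived: either a fresh GET, or the failure in step 6.
    return ('<form method="post" action="">'
            '<input name="user"><input name="pwd" type="password">'
            '<input type="submit"></form>')

print(handle("GET", {}))
print(handle("POST", {"user": "alice", "pwd": "secret"}))
```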

I could not reproduce the problem in Firefox 0.9 (it completely ignored the SPNEGO) or Opera 7.0 (it just gave an error), since they do not seem to support SPNEGO (or is it NTLM? I can't be sure because I was not logged into the domain).

Update 7/13: I have been getting more hits looking for SPNEGO support in Firefox, and thought I would add the information for the people who come to this site looking for it. Integrated Windows Authentication can be performed using NTLM authentication or Kerberos ticket-based authentication (Win2K and above). Firefox supports NTLM, so if you enable Integrated Windows Authentication on IIS (or the corresponding authentication scheme on any other server), the server will send both Negotiate and NTLM as supported authentication mechanisms. On XP (the platform I tested on) you will get a prompt for ID and password, which Firefox then uses to perform the NTLM authentication. You cannot disable the prompt for user ID and password (not that I know of). At the moment there does not seem to be support for the Negotiate authentication scheme (the Kerberos ticket-based authentication that lets Internet Explorer provide a more user-friendly, reduced sign-on solution). Another thing to keep in mind: if a server besides IIS is used, make sure it sends the NTLM header along with the Negotiate header. If it sends only the Negotiate header (some SSO products do), Firefox will not switch to NTLM for authentication and will just display the page returned by the server. Hope this helps!!
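The scheme-selection behaviour described above can be summarized as a toy model. This is not real browser code, just my reading of what IE and Firefox 0.9 do with the authentication schemes a server offers:

```python
# A toy model of the observed behaviour: given the WWW-Authenticate
# schemes a server offers, which scheme does each browser attempt?

def pick_scheme(browser, offered):
    """Return the auth scheme the browser will attempt, or None."""
    offered = [s.lower() for s in offered]
    if browser == "ie" and "negotiate" in offered:
        return "negotiate"   # Kerberos ticket, no prompt for the user
    if "ntlm" in offered:
        return "ntlm"        # Firefox prompts for ID and password
    return None              # the page returned by the server is shown

# IIS sends both headers: Firefox falls back to NTLM.
assert pick_scheme("firefox", ["Negotiate", "NTLM"]) == "ntlm"
# An SSO product sending only Negotiate: Firefox 0.9 cannot authenticate.
assert pick_scheme("firefox", ["Negotiate"]) is None
# IE uses the Kerberos ticket via Negotiate.
assert pick_scheme("ie", ["Negotiate", "NTLM"]) == "negotiate"
```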

Tuesday, June 01, 2004

Standardization of XML or XMLization of standards and Services

Well, the title kind of gives the basic idea behind these thoughts, i.e. what is the flood of XML-based standards trying to achieve? The way the whole set of standards was initially explained to me is
  • XML is good (mkay),
  • it is required to interoperate (mkay),
  • previous attempts at making computers talk to each other were bad (mkay),
  • Web Services are good and we can talk to Microsoft (mkay).
But looking at it that way does not make a lot of sense to me.

XML is same

XML is NOT the solution to the interoperability issue. Period!!! As far as I understand, the idea of interoperability is to make two machines talk the same language, and it has nothing to do with making the protocol readable to humans. The last time I checked, my computers were still talking under the hood in a weird language called binary. So what does XML have to do with the interoperability issue, and what was missing from DCE and CORBA that did not allow them to solve it? Let us look at the whole idea of making computers talk to each other. Most people agree that computers talk to each other using protocols, which are basically languages with their own word sets. Now, in order to let two applications running on different machines talk, the following approaches have been tried.
  • Same code on both machines: which is what DCE tried to achieve. It had its own issues of not being available out of the box on mainframes (I remember hearing about a very basic DCE version on OS/390 from IBM) and Windows. Besides that, it was realized that porting the code to different environments itself resulted in a set of interoperability issues.
  • The same or different code, but a very well-defined protocol: something that was tried with CORBA, using binary tags and the bunch of standards developed around it. Well, implementation issues and the associated interoperability issues made sure the implementations from different vendors could not talk to each other.
So it seems that having standards does not solve interoperability, at least in the software world. Why? One reason could be that open standards are too open, providing a lot of leeway to software implementations, which results in interoperability issues (something we can see with most of the Web Service standards). Another issue that seems important is that standards follow products and not vice versa. This means that products come out with new features, and then standards try to create specifications covering the similar features provided by competing vendors, resulting in a standard that is too late to give vendors any incentive to interoperate. Hence any thought of establishing standards for low-level product interoperability (like RPC or message transport protocols) is meaningless. Then what is it that we are looking for in terms of standards that will help us achieve interoperability? We should be looking at Business Service standards. This is what we should try to achieve via Service Oriented Architecture; otherwise the whole of SOA is of no use and will only result in the development of yet another interface in front of the business logic, alongside mainframe console applications, DCE, CORBA, and HTML.

XML is different

Then why should we bother with XML and the whole Web Service stack with WS-* and the bunch of OASIS standards? Looking at this pool of standards, we have to realize that these are the foundation blocks on which we will be building the Business Service standards; they are just a means to that end. In that sense we could have used any other low-level open standard (like DCE, CORBA, or DCOM if it were open). So how is XML different? One thing we have to realize is that even though interoperability is achieved between machines, the standards are developed by humans. So if you wanted the business groups of companies to get together and chalk out a Business Service standard at a global level, allowing them to define these protocols in a human language (like English) would be more appropriate than binary tags.

This means that, say, HR managers can get together and define which business services they require to be automated, and define the format of the data that should be sent around, as protocols and standards. These standards can then be implemented by any product that provides the corresponding HR services. So now you do not have to develop the interfaces in-house using Java or .NET; instead you can buy the business-service interface as a commodity, which is basically the code providing the interface. This code comes with application servers, or as libraries from software companies focused on vertical segments, or some combination thereof. The developer would only be required to write the pure business logic behind the service (which most of the time would already be present). Even though it would be better to have these standards define the aspects of the service (like transport layer, authentication level, encryption level, and so on), I think we may have to evaluate that during implementation and require the developers to configure these aspects of the protocol.
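To make this concrete, here is a sketch of what a message under such a hypothetical HR business-service standard might look like. Every element name here is invented for illustration; the point is that the "protocol" is human-readable XML that HR managers could have agreed on, and any compliant product can parse it:

```python
# Sketch: a request under an imaginary HR business-service standard.
# The standard is negotiated in human-readable terms, yet any product
# can process it mechanically.
import xml.etree.ElementTree as ET

message = """
<hrRequest service="getEmployeeRecord">
  <employeeId>E1234</employeeId>
  <fields>
    <field>name</field>
    <field>department</field>
  </fields>
</hrRequest>
"""

root = ET.fromstring(message)
service = root.get("service")
employee_id = root.findtext("employeeId")
fields = [f.text for f in root.findall("fields/field")]
print(service, employee_id, fields)
```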
Even though you can see some such service standards taking shape at OASIS, I am not sure how much impetus we have from vendors. Most of the time I see them fighting over standards for low-level services and not providing enough help to business owners to get to the next step, i.e. business standards. Last week it seemed like BEA started something on that front with the TMC SOA blueprint, but I am not sure whether it will go any further until it gets backing from organizations devoted to the business side (like accounting or HR).

Road Ahead

Now, given the time it takes for the basic protocols themselves to become standards, any thought of having Service standards seems far away in the future. I get a feeling that we will lose this battle of standardization again. There is already talk about how XML eats into CPU cycles and network bandwidth during (de)serialization and transport, and how we should go back to the good old days of binary protocols. These people have to understand that XML, though not the best way for machines to talk to each other, may be the best chance to make humans agree on standards. We should also begin working on another set of standards for converting XML standards into a protocol that suits computers more than humans, i.e. the binarization of XML standards. This may be our only chance to get out of the endless technological cycles of standardization and take standards to a logical end. PS: Hmm... a prophetic doomsday ending makes it quite entertaining, I hope, even though it does not make any sense.
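A crude illustration of what binarization buys: the same record encoded as human-negotiable XML and as a packed binary form that suits the machine better. The field layout here is made up for the example:

```python
# The same two-field record, once as XML text and once packed into a
# fixed binary layout (two unsigned 32-bit integers, network byte order).
import struct

employee_id, salary = 1234, 55000

xml_form = f"<employee><id>{employee_id}</id><salary>{salary}</salary></employee>"
binary_form = struct.pack("!II", employee_id, salary)

print(len(xml_form.encode()), "bytes as XML")
print(len(binary_form), "bytes as binary")
```

Humans argue over the first form; the wire carries the second. A binarization standard would just be an agreed, mechanical mapping between the two.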

Sunday, March 14, 2004

Empire: By Niall Ferguson: My thoughts

Just saw the History Channel program on this book and got the basic idea of what it is all about (I should have read the book before writing this article). It seems like a how-to book for building empires, so I thought I would add my thoughts on the subject. This occurred to me while reading about the rise of the Ottoman Empire (one of the longest-enduring empires, 13th/14th century to the 19th century): if you take the concept of the janissaries and stretch it to administrative services, we get the ICS (Indian Civil Service). Knowingly or unknowingly, the British developed a process of choosing people (intellectual, receptive, and monetarily successful, but not the traditional elite), "shock and awe"-ing them culturally in their society, and then using them to govern the local people (since they come to feel themselves different from the "barbarian" local population). This feature is quite apparent in most of the novels written by Premchand (the great Indian novelist of pre-independence India). This seems to be the way the British developed the civil service system that was used to rule India. Rulers/viceroys like Curzon and others may have thought they were working in the interest of the local population, but they were so far away from the locals that there is no way they could have known what the locals' interests were. But something that really worked in India's favour over time was that some of the elites got disillusioned with the British rulers' "shock and awe" culture, and so the mechanisms and institutions that were set up to rule India seem to have been turned against the British in the end. One thing I would really give the British, though, is that they did not try to suppress the movement with bullets. This may be due to foolhardiness or underestimation of the movement during its initial stages, and/or due to the sheer size of the movement at the later stage, when suppression might have resulted in violent rebellion.
Beyond that, the idea that the problems of the world are not a result of British imperialism is, at the very least, a laughable statement. The British rulers of the colonies, without any consideration of local sentiments and possible consequences, drew the administrative/country lines that have been the point of contention for the past 60 years. This may be because of the bankruptcy of Britain after the Second World War, the ruling parties' complete lack of interest in anything but leaving the existing colonies, without any thought about the future, and local leaders who, in their rush to get "independence", thought that all the problems would be solved automatically once the British left. The thinking that imperialistic powers are needed to help the barbarians develop culture/democracy is the basis of all modern imperialism (since the mid-19th century), which tries to justify the cultural and social disruption of native people as humanitarianism and cultural assimilation. Anyway, this has been going on for centuries in one form or another and will continue as long as we are part of civilization, because of inherent human nature.

Labor market: I think I now get it?

Let me try to build a model of society in terms of labor. Basically, the society/community at any place has a variety of people with different types of built-in skills: some may be very good at social engineering, while others may be better at analytical skills, at creative skills, or at performing repetitive jobs. This may be apparent in varying IQ, EQ, or other "scientific" scores. Nature makes sure that the entire gamut of people with varying skills is part of society, in order to ensure the survival of the species over time. So any community will, over time, develop strata based on the skills of its people. Sometimes this stratification may be enforced by society itself without regard to a person's skills, as in the Hindu caste system; at other times the social structure may skew in favour of a specific skill set that does not follow the natural distribution of skills, as in highly industrialized Western nations. These divergences between natural skill sets and required skill sets result in unrest in the labour market, which may have its own political and social impacts: protectionism, xenophobia, a broadening gap between haves and have-nots. But over time nature adapts to the new skill requirements through the basic strategy of survival of the fittest. I am not sure how quickly nature can adapt. We know that humans typically have a long reproduction cycle (a female can start reproduction only after 14-18 years). But I think this would be offset by large population size, improved health care, and better communication and transport. A larger population ensures that more permutations and combinations of genes are possible, and improved health care ensures the survival of otherwise "naturally unfit" people who have the skills required by society (this is based on the assumption that skill is a combination of nature and nurture, where nurture should be used to hone natural skills, which may be completely wrong).
Beyond that, better communication and transport facilities mean that a society's skewed skill requirements can be met by migration of people to the work, or migration of work to places with the skill set (for example, towns and universities). Obviously the world, and the labor market in particular, is more complex than the simplistic model above, but it is good enough for me to understand the concept. I am not sure how accurate the idea of nurture helping nature is in the case of the modern labor market. Given that the required skills are so dynamic that they change every decade, nature cannot be expected to cope. In such a scenario, will nurture (i.e. incentives, training) help society get the required skills? I do not think so!! There is a limit to how much nurture can help. Humans, I guess, have an inherent limit on learning new skills, which degrades with age. So aging societies (i.e. those with a higher percentage of elderly people) and/or smaller societies are at an inherent disadvantage if they want to remain leaders in the skill business. Instead, these societies can succeed if they already have existing skills (like political, human, and financial management) and power, which can be used to control and develop the younger and bigger societies as a "skill pool" to be drawn on over time. How is this going to affect the controlling society, which will have its own set of people who do not have the necessary skills or cannot become part of the elite that controls the "skill pool"? Over time the difference between haves and have-nots is going to increase, with further social implications, and it may culminate in the social/economic status of the have-nots falling to the level of the haves in the "skill pool" society. As for the younger/larger societies, they will continue to be puppets in the hands of the controlling society. Wow, this seems like a real doomsday scenario! Am I that depressed!!
Anyway, I do not think this conspiracy theory is going to play out, but it seems a worst-case scenario that, I hope, I am going to laugh at 5 years from now!!

Sunday, February 29, 2004

Products and Frameworks

I have thought about this for some time, and recent discussions at TSS have got me thinking about it again. For a start, I belong to that camp (I guess it is extinct) that would prefer Hurd over Linux (I wish Linus had waited for some time). But over time I have realized that each domain, whether it is operating systems, application servers, or some other product, moves from monolithic systems to framework systems. This process takes its own time. The reason is that before a particular domain starts to grow, people do not have a complete understanding of what the framework for that domain should look like. As the monolithic products hit the road, people start seeing the cracks. As the implementations grow, the vendors understand the domain better and can develop a framework that fits it. But by that time the monolithic applications have grown so big that vendors do not have any incentive to rewrite their products on top of the framework and make the life of the customer easier. There are definitely some products out there which are the result of research and are thus based on frameworks. At the moment, products like J2EE are at a level where the scope of the frameworks is getting redefined. I will try to summarize what is going on and how things may proceed.

J2EE

So far J2EE has grown as a framework which specifies the lifecycle of the application in the container, and it also specifies a set of services that may/should be made available by the container to the application (most of the time using the existing specification for that particular domain, like Directory, JMS, etc.). They did a great job at that. But as people started putting together applications, and vendors started developing products to match these specifications, they realized that a lot of the time people need to be able to configure the container itself for their application to work, and the application framework is not good enough.
At the same time, the commercial products have more to offer in terms of services than what is required by the specification, and people would like to use those services. So what is really needed is an application server framework (I wish somebody would develop something similar for C; OpenGroup, are you listening?). In parallel, people fed up with the complexity and cost of application servers, or looking to develop a lighter, more flexible, and J2EE-independent application framework, started developing frameworks for Java applications. Frameworks like Avalon (my favourite; why do I always love stuff that most people do not care about?), PicoContainer, and Spring were the result of such requirements. Besides that, a lot of framework-based products were being developed to simplify the life of Java developers: Struts and WebWork (and similar web frameworks) for the front end, Hibernate for the back end, and so on. In addition, advances and maturity in AOP and metadata attribute concepts and implementations were enticing people to use them in their applications. Now people are looking to use these various components to develop J2EE applications. But the framework was never meant to address the problems that people wanted to solve. So how should we proceed from here? Basically, the next-generation framework will have to be for application servers instead of for applications. What would such a framework look like? It may look something like Avalon ;-) The idea is that each of the services, like Servlet-JSP/front-end applications, JMS/asynchronous messaging, JNDI/directory discovery, IIOP-socket/synchronous messaging-RPC, scheduler/time management, transactions, cache/replication/clustering, EJB/Java applications with business logic, JDBC/database manager, and security services, is itself a service in the application server. Any of these services can be used by other services or applications.
So, for example, the database manager service may use the cache service to provide better performance. Some of the components/services, like EJB and front-end applications, can themselves be containers which host the applications written by developers. These containers can be standard JSP, or enhanced containers like Struts or WebWork, or they may support AOP or any other proprietary thing that people want them to support. But it is important to define the lifecycle of these providers and especially the management/configuration interface (maybe JMX is good enough). This would allow users to configure these containers for their applications in a standard way and not bother with proprietary files like weblogic.xml. At the same time, J2EE should get out of the way of defining which services should be part of the specification. Any service that follows the Java specification should be allowed to be part of the J2EE specification as long as it is defined by one of the JCPs. This would allow vendors to innovate and respond quickly to market requirements instead of waiting for J2EE to pass them. So if tomorrow a vendor sees that a rules engine is in demand, they should be able to ship it without breaking J2EE requirements. Another important aspect of the system is the enhancement of these containers themselves by application developers. With AOP showing the way, it may be prudent to design the specifications for generic containers that are extensible using various methods, like configuration files, AOP, or a proprietary method. I think if we can lead J2EE along this path, we will have a more flexible system. Some may raise the question of how the application server companies will make money in such a system. I am not sure that we should worry about it.
The basic application server vendors can continue to make money by shipping the complete product that provides a default implementation of all the services, because there will be products that depend on other services, and even if you replace one component with a new product, users will still need all the other services to function well. So I do not see vendors being threatened by this system, and at the same time it will allow the experts in particular fields to develop components that can easily be integrated into the system without proprietary wrappers being developed around them. This is most important for services like cache, transaction manager, and security, which cut across all the services and the framework itself. For this to work, another important component is the JCP. When defining the services/APIs, the JCP should take the management aspects into account and develop the schema for the same. Each of the services should assume that service providers will be implementing them and will need to expose a JMX interface that allows external customers to tune them at initialization or at runtime. Even though some of the services take that into account, this information is missing from most of the other places and results in chaos when the systems hit the street.
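The "everything is a service" idea above can be sketched in a few lines. This is a toy in Python, not J2EE; the names and interfaces are invented, and a real container would add lifecycle management and a JMX-style management interface:

```python
# Toy sketch: services live in a registry and find each other by name,
# so e.g. a database-manager service can use a cache service without a
# proprietary wrapper around either one.

class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def lookup(self, name):
        return self._services[name]

class CacheService:
    def __init__(self):
        self._store = {}
    def get(self, key):
        return self._store.get(key)
    def put(self, key, value):
        self._store[key] = value

class DatabaseManager:
    """A service that uses whatever cache service the registry holds."""
    def __init__(self, registry):
        self._cache = registry.lookup("cache")
    def query(self, key):
        hit = self._cache.get(key)
        if hit is not None:
            return hit
        value = f"row-for-{key}"   # stand-in for a real database read
        self._cache.put(key, value)
        return value

registry = ServiceRegistry()
registry.register("cache", CacheService())
db = DatabaseManager(registry)
print(db.query("42"))   # first call: goes to the "database"
print(db.query("42"))   # second call: served from the cache service
```

Swapping in a vendor's better cache means registering a different object under "cache"; the database manager never changes, which is the whole point.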

Friday, February 06, 2004

User Pain Lifecycle and an approach to solving the problem

Basically, why does a company buy a product even after building it in-house? I can think of some technical reasons (I am sure there are a lot of non-technical reasons):
  1. Vendor has more subject matter expertise – the simple idea that if a vendor has designed and developed a product, then the vendor would have designed the solution as an environment-independent framework model which can address most of the use cases out of the box and, at the same time, can be extended to cover all the use cases.
  2. Vendor will gain more SME over time – as time progresses, the vendor integrates the product into more diversified environments and has to enhance the system for different use cases, so when the company runs into those use cases the vendor will be able to provide solutions.
  3. Vendor has dedicated resources and can spend more time, money, and effort to make sure that the product works.
But given my limited experience with new products, it seems that most of the time none of this is true.

Pain Life Cycle

Typically, most companies have a "pain" to start with, and then someone comes up with an idea to solve the problem. So a small system is built to relieve the pain, which slowly starts getting accepted and enhanced. As time progresses, the small product grows to become a large product which fulfills most of the internal users' needs.

Depending on how good the architecture and coding team was, the IT department ends up with a fine product, or with a blob of code that works but nobody understands.

While this was going on, somebody (the Vendor) noticed that this requirement exists in a lot of places (the start of the hype) and so starts building a product to address client needs (note that this may not apply to the few companies started by people from an academic background). Marketing/Sales wants to get the first few versions of the product to market as soon as possible, resulting in a faulty architecture, sloppy code, and limited testing. This product has very basic functionality, is defective, and is architecturally weak.

At this point the media and analysts have started to pick up on the hype and have added words like "paradigm shift" and "next generation" to these requirements. At the same time, the company is in deep pain from managing the in-house-developed blob of code.

The company buys the product after some evaluations and pilots, with no idea how the requirements are going to change over time.

So the company ends up with a product that does not provide all the functionality the in-house product provided, does not integrate well into the company's environment, is defective/unstable/poorly performing, and is non-extensible.

Why do companies do that?

I don't know.

Another approach

But what if the company took a different approach -

  1. Consortium of vertical-industry users - Form a consortium of IT professionals who will come up with a set of use cases common to all its members. So instead of the vendors developing fuzzy use cases and even fuzzier standards around them, each vertical segment would have its requirements laid out. The consortium may have its own labs, or donated labs, where products can be verified to be requirement-compliant.
  2. Selectively "open-design" their internal product - This is based on the idea that the products developed internally are superior to the first few versions of vendor products. So, in order to cut down the time and provide vendors with a roadmap, the customers can open-design (could not think of any other word) the high-level design and development information, which can be made either public or available only to legitimate members. This will help vendors and other open-source efforts get a handle on "what the client wants".
  3. Share the experiences with the consortium - Another important idea, to ensure that the knowledge is not wasted and people can learn from the mistakes of others.
  4. Vendor interfaces - Force vendors to implement the common features required by most of the members of the consortium, and make them follow a framework architecture.

Most of these thoughts are nothing new, and I guess people have not seen the benefits of sharing outweigh the benefits of keeping their internal information secret. But until we have a legal memory-erasing device or a single giant enterprise, people will continue to change jobs and take these "internal secrets" to the competitors. So why not do this sharing in a formal way, especially when it is going to help companies improve their core functionality and not be obsessed with IT?