In a posting on Meeraj Kunnumpurath's weblog, I read that the Java Servlet 2.5 specification (still in maintenance review) will have the ability to inject dependencies into classes whose lifecycle is managed by the container. This means that you can simply inject and use resources, such as EJBs and DataSources, in the same way as you can now in the EJB 3.0 specification (still in beta).
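To give an idea of what that could look like, here is a rough sketch of resource injection in a servlet under Servlet 2.5 / Java EE 5. The OrderService EJB, its countOpenOrders method and the jdbc/MyDS DataSource name are made-up names for illustration only:

import java.io.IOException;
import javax.annotation.Resource;
import javax.ejb.EJB;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sql.DataSource;

public class OrderServlet extends HttpServlet {

    // The container injects the EJB reference before any request is handled
    // (OrderService is a hypothetical business interface)
    @EJB
    private OrderService orderService;

    // A DataSource, looked up by its JNDI name and injected by the container
    // (the name jdbc/MyDS is just an example)
    @Resource(name = "jdbc/MyDS")
    private DataSource dataSource;

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // No JNDI lookup code needed: the injected fields are ready to use
        resp.getWriter().println("Open orders: " + orderService.countOpenOrders());
    }
}

No more InitialContext boilerplate in the servlet itself; the container wires everything up.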
Currently, I'm experimenting with Oracle's EJB 3.0 implementation, and I have to say my first impressions are quite hopeful. Especially the simple way of using resources in an EJB, like an EntityManager or another EJB, is a feature I really like as a J2EE developer. I think dependency injection, and IoC in general, is a great feature that increases the flexibility and development speed of a Web Application Framework (WAF). For example, look at how popular Spring is these days...
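For instance, a minimal EJB 3.0 session bean with an injected EntityManager and a reference to another EJB could look roughly like this (OrderService, CustomerService, Customer and Order are hypothetical classes I made up for the example):

import javax.ejb.EJB;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@Stateless
public class OrderServiceBean implements OrderService {

    // The container supplies a managed EntityManager; no factory or lookup code needed
    @PersistenceContext
    private EntityManager entityManager;

    // Another session bean, injected by its business interface (hypothetical)
    @EJB
    private CustomerService customerService;

    public void placeOrder(String customerId, String productCode) {
        // Use the injected resources directly
        Customer customer = customerService.findCustomer(customerId);
        entityManager.persist(new Order(customer, productCode));
    }
}

Compare that with the deployment descriptors and JNDI lookups you had to write in EJB 2.x, and it's clear why I'm enthusiastic.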
Saturday, December 17, 2005
Sunday, December 11, 2005
Downloading at 150 gigabits per second
Today, I read on the Fermilab site the news that the California Institute of Technology has won the SC|05 Bandwidth Challenge in Seattle last November. The team of high-energy physicists, computer scientists and network engineers led by Caltech transferred physics data at a rate of over 150 gigabits per second, equivalent to downloading over 130 DVD movies in one minute.
The team's entry is part of the preparation for a new particle accelerator, the Large Hadron Collider (LHC), which will begin operating in 2007 at CERN in Geneva, Switzerland. Data produced by the LHC will be accessed by thousands of scientists around the world to help answer questions about the universe. The processing, distribution and analysis of the data will be done using high-speed optical networks, software to monitor and manage the data flows across those networks, and grid computing.
I wonder when I, as an internet user, will see my bandwidth increase thanks to these kinds of developments in network technology... ;)