There is a balance between delivering use cases to the market and maintaining overall software quality in an Agile software development project. It's extremely easy to get "trapped" in the "tunnel vision" of providing user-facing features as fast as possible, and to quickly encounter architectural deficiencies and technical debt that impact the overall quality and performance of the software.
Agile does not mean NO architectural design! Many teams just learning agile have difficulty understanding how design "fits" into the use case or user story methodology.
Here at ERDAS, we use planned 'spikes' and design sessions, scheduling them in the iterations before implementation of the user/system interaction, to ensure that the resulting implementation builds soundly upon our architecture.
A SPIKE is a "technology" discovery process. It can be a research project into technologies or algorithms, an evaluation/benchmark or prototype of technologies to find a "best fit", or a discovery pass over existing algorithms and architecture to produce man-day estimates or a "Level of Effort" for getting some use case completed. We use SPIKES to address the "unknown" or "uncertain", dedicating time to make it known and to determine how long it will take to satisfy a use case, and we always report the results of every spike in the Iteration Review.
We also enforce a "time-capped" rule on spikes. This rule allocates a fixed amount of time in which to "discover" what we want to know. If a blocking issue is encountered along the way, we can always increase the duration of the spike, but we very seldom do so. Time-capping the spike really enables detailed planning, ensuring we avoid "creep" in a discovery process and stay on schedule.
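As a rough sketch of the rule (the structure and field names here are hypothetical, not our actual backlog tooling), a time-capped spike is just a discovery task that carries a fixed time budget and must come back with findings for the Iteration Review:

```python
from dataclasses import dataclass

@dataclass
class Spike:
    """A time-capped discovery task planned into an iteration (hypothetical structure)."""
    question: str          # the "unknown" the spike is meant to resolve
    time_cap_hours: float  # fixed allocation agreed at planning time
    hours_spent: float = 0.0
    findings: str = ""     # reported back at the Iteration Review

    def over_cap(self) -> bool:
        # The cap may be consciously extended if a blocking issue appears,
        # but by default exceeding it is a planning signal, not an option.
        return self.hours_spent > self.time_cap_hours


spike = Spike("Best-fit tiling approach for terabyte-scale mosaics", time_cap_hours=16)
spike.hours_spent = 12
print(spike.over_cap())  # False: still inside the agreed cap
```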
Thursday, May 28, 2009
OpenGeo Team is "faster and better" than anyone else in the world???
I really get a kick out of the provocative self-proclamation on the OpenGeo Team page that the OpenGeo Team is "...faster and better than anyone else in the world..." at solving geospatial problems.
I recommend a "geospatial" Academic Decathlon!
ERDAS has been solving "real world" National Mapping Agency geospatial workflows for decades now...so how are you 'faster' and/or 'better' than the world-class geospatial scientists, remote sensing scientists and developers at ERDAS?
Should we measure this based on software revenue? Possibly an 'apples to apples' comparison of products and satisfied use cases? "Challenge" each team with a use case to satisfy (FULLY!!)? How about the number of supported sensors and formats? Or what about a third-party review of resumes?
That's just a ridiculous statement, guys...
Friday, May 22, 2009
ERDAS APOLLO Reference Sites
With a new enterprise product suite on the market, it's extremely important to have reference sites and to determine "who" is using the ERDAS APOLLO and "what" they are doing with it.
Here's a small list of some of the ERDAS APOLLO customers that have Press Releases:
British Transport Police Preparing for 2012 Olympics with ERDAS APOLLO
Cyprus University of Technology Implementing ERDAS APOLLO
ERDAS APOLLO 2009 Selected by China’s National Environmental Protection Ministry
ERDAS Solutions Used for Tourism 3D Spatial Analysis
Saxon Forestry in Germany Implementing ERDAS APOLLO
Thursday, May 21, 2009
The Price of "FREE" Open Source Software has really become Expensive!!
I was looking at the OpenGeo Version Matrix, and the price to "buy in" to open source geospatial software has really become crazy! It appears the line between capitalist and geospatial philanthropist has become blurred. It's more expensive to buy into open source than to purchase COTS software today!
$70,000 for 300 hours of service!!!!!!!!!!! OMG!
I run into so many clients that are "hamstrung" on open source solutions, being funneled into a bottomless money pit. No doubt the "hook" to lure people into the evaluation stage is there with the "free" pitch, but the REALITY of what it will take to really meet requirements smacks you in the face immediately.
The "try it...figure out what you really want...then pay me $70K" open source business model is a bit crazy.
Always remember: once you buy into it, YOU MAINTAIN it for the rest of your life. OUCH!
The market is begging for a vendor to pick up the ball here...luckily, ERDAS is HERE!
Give the "out of the box" SDI that works, has a WORLD CLASS development, support and product management team supporting the project with real world PRODUCTIZED features and evaluate the difference for yourself!
The ERDAS APOLLO!!!
Can somebody calculate an ROI for me immediately!
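Since I asked: here is the shape of the back-of-the-envelope math I have in mind. The $70,000 for 300 hours of service is the figure quoted above; every other number in this sketch is a hypothetical placeholder, not a real ERDAS or OpenGeo price, so run it with your own figures.

```python
# The $70,000 / 300 hours figure is the one quoted above from the OpenGeo matrix;
# every other number here is a hypothetical placeholder for illustration only.
service_fee = 70_000        # open source "buy in" service package
service_hours = 300
print(f"Effective service rate: ${service_fee / service_hours:,.0f}/hour")  # roughly $233/hour

# Hypothetical three-year totals, purely to show the shape of the comparison:
oss_total  = service_fee + 3 * 40_000   # assumed ongoing integration/maintenance effort
cots_total = 60_000 + 3 * 12_000        # assumed COTS license plus annual maintenance

print(f"Open source, 3 years (assumed): ${oss_total:,}")
print(f"COTS, 3 years (assumed):        ${cots_total:,}")
print(f"Difference (assumed):           ${oss_total - cots_total:,}")
```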
Tuesday, May 19, 2009
Cherokee County, Georgia...the MOST MAPPED SPOT ON EARTH??
For the ERDAS Enterprise Products Public Demo Site, we've been very fortunate to leverage our data vendor business partners and the Cherokee County GIS Team to collect lots of vector, terrain and imagery data for this area of interest and serve it through our enterprise products. For one, the ERDAS global headquarters is in Norcross, Georgia, in close proximity to Cherokee County; second, it's just a pretty nice place (that has gone through massive change over the past 10 years).
Let's take an inventory of the data that we've collected:
1. LANDSAT 1, 4-5 and 7 scenes from 1973 - 2008 (multispectral and panchromatic)
2. Digital Ortho Quarter Quads from 1999
3. Airborne 2006 high resolution ortho imagery
4. IKONOS imagery from 2000-2008 (multispectral and panchromatic)
5. USGS Digital Raster Graphics at 1:24K, 1:100K and 1:250K
6. SRTM DTED
7. SPOT scenes from 1999-2008 (multispectral and panchromatic)
8. National Land Cover Dataset from 1992 and 2001
9. 2008 Vectors of Roads, Parcels, Land Lots, Zoning, Buildings, etc., from Cherokee County, Georgia
on and on and on and on....
Note that all of the imagery and terrain is served from a SINGLE web service endpoint (of course, you only gain access to the public layers by clicking on this).
The vectors are hosted in an Oracle 11g database with Spatial: no proprietary middleware, no proprietary SDK, no proprietary data model required. JUST Oracle Spatial, please!
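To make "JUST Oracle Spatial" concrete, this is roughly what SDK-free access to those vectors looks like: plain SQL against SDO_GEOMETRY through a standard Oracle driver. The connection string, table and column names below are hypothetical examples, not our actual demo schema.

```python
import cx_Oracle  # standard Oracle driver; no vendor middleware or SDK involved

# Hypothetical connection string and table/column names, for illustration only.
conn = cx_Oracle.connect("gis_user/secret@dbhost:1521/orcl")
cur = conn.cursor()

# Find parcels whose geometry interacts with a query window, using plain
# Oracle Spatial SQL (SDO_FILTER is a primary-filter spatial index lookup).
cur.execute("""
    SELECT parcel_id, SDO_UTIL.TO_WKTGEOMETRY(geom)
    FROM   parcels
    WHERE  SDO_FILTER(
             geom,
             SDO_GEOMETRY(2003, 4326, NULL,
                          SDO_ELEM_INFO_ARRAY(1, 1003, 3),
                          SDO_ORDINATE_ARRAY(:minx, :miny, :maxx, :maxy))
           ) = 'TRUE'
""", dict(minx=-84.6, miny=34.1, maxx=-84.3, maxy=34.4))

for parcel_id, wkt in cur:
    print(parcel_id, wkt.read()[:60], "...")  # WKT comes back as a CLOB
```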
All of this data, over all of these timeframes, in one location made me ponder: is Cherokee County, Georgia now the MOST MAPPED AREA IN THE WORLD?!!
We are using our business relationships to collect even more data over the area, so stay tuned and see how many "sensors" we can collect over a single area!!
There are some really EXCITING FEATURES coming in the ERDAS APOLLO 2010 release this September, so GET READY to see Cherokee County, Georgia like you've never seen it before!!!
Friday, May 15, 2009
Is the Geospatial World devoid of Performance Benchmarks??
Why are performance benchmarks so elusive for enterprise geospatial software?? Every Request for Proposal (RFP) for enterprise geospatial software requires some level of "performance" to be reported. Commonly, RFPs ask for the expected number of concurrent users, the throughput and "load" the system can handle, and, indirectly, a recommended hardware set based on a purported number of users of the system.
Here at ERDAS, we invested in HP LoadRunner to design an enterprise performance testing system that is the best I've ever seen in my history with enterprise software. Unlike many vendors, we don't report "predictive" numbers; we produce and report ACTUAL performance numbers on the real-world enterprise systems under design! We of course use the testing setup internally to determine 'things to improve' and to measure the impact of individual features (e.g. portrayal or reprojection) against a known baseline. Just to be forewarned, the setup was not cheap, and it was also not easy to figure out how to properly implement, but the stability, flexibility and repeatability of the test "scenarios" and the RESULTS produced are AWESOME!
All I know is that I "FROTH" at the opportunity to stand APOLLO up against any system on the market today! We've "handily" beaten several competitors in "head to head" evaluations and always meet our documented performance results. I attribute this to our investment in performance testing setups and to the performance test scenarios that are required to pass before every release. The testing setup has proven INVALUABLE in succinctly diagnosing performance issues, ensuring our performance at release time meets our standards, and reporting to customers the performance they should expect...and then MEETING that expectation!
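Our actual rig is built on HP LoadRunner, but the core idea is easy to sketch: fire concurrent, repeatable request scenarios at a service endpoint and record throughput and latency against a known baseline. The stripped-down sketch below (the endpoint URL and layer name are placeholders, and this is nowhere near a full LoadRunner scenario) shows the shape of such a test:

```python
import time
import statistics
import concurrent.futures
import requests  # third-party HTTP library

# Placeholder WMS endpoint and layer; substitute a real service to run this.
URL = "http://example.com/erdas-apollo/wms"
PARAMS = {
    "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
    "LAYERS": "cherokee_ortho_2006", "STYLES": "",
    "SRS": "EPSG:4326", "BBOX": "-84.6,34.1,-84.3,34.4",
    "WIDTH": "512", "HEIGHT": "512", "FORMAT": "image/jpeg",
}

def one_request(_):
    """Issue a single GetMap request and return its wall-clock latency in seconds."""
    start = time.perf_counter()
    resp = requests.get(URL, params=PARAMS, timeout=30)
    resp.raise_for_status()
    return time.perf_counter() - start

CONCURRENT_USERS = 20
TOTAL_REQUESTS = 200

with concurrent.futures.ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    wall_start = time.perf_counter()
    latencies = list(pool.map(one_request, range(TOTAL_REQUESTS)))
    elapsed = time.perf_counter() - wall_start

latencies.sort()
print(f"throughput: {TOTAL_REQUESTS / elapsed:.1f} req/s")
print(f"median latency: {statistics.median(latencies) * 1000:.0f} ms")
print(f"95th percentile: {latencies[int(0.95 * len(latencies))] * 1000:.0f} ms")
```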
Wednesday, May 13, 2009
ESRI vs. OGC Community
ESRI is pushing hard to proliferate their own PROPRIETARY services with their 9.3.x Server offering, recommending them over standards-based, interoperable web services to the GIS community "at large".
I am also PUBLICLY stating that their support for the OGC services (especially CONSUMING them in their clients) is VERY WEAK functionally as a feature set, and the performance is very, very poor. The ERDAS APOLLO Image Manager Web Client offers a much better user experience and is faster at consuming OGC services than ArcMap!!
Let's put ourselves in their shoes and dwell on why this would be.
ESRI currently holds the largest market share in the GIS domain, and they have every intention of keeping it. For the market leader, making the OGC services actually work equivalently to their proprietary services could marginalize and commoditize feature sets in the GIS market, leaving "opportunity" to those who support the interoperable services. Forcing the customer to "have" to use proprietary services and SDKs to meet their use case is also in ESRI's interest, as it creates vendor lock-in on both the server and client side. It's quite easy to say that the OGC services aren't "rich" enough to provide the use cases that clients need when the clients' only experience with them is extremely limited and the performance is very slow (as experienced in ESRI's software today). They also have no interest in a "governing" body controlling technology decisions and/or application profiles on the technical side.
OGC services, on the other hand, need to provide the user experience and the PERFORMANCE that proprietary services do. In my opinion, this can only be provided by the geospatial vendors. The open source projects don't have the wealth of domain experience, existing codebase and market experience to do this. They will of course provide a user experience, but with very poor performance.
Enter...VENDORS SUPPORTING THE STANDARDS AND DOING IT RIGHT! ERDAS has supported the OGC standards in an extremely MEANINGFUL and HIGH-PERFORMANCE manner. Our OGC services are CITE certified, and "under the hood" we provide the depth and richness of format support, sensor model support, workflow and out-of-the-box end user experience in a single product that is expected of a commercial vendor.
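For anyone unsure what "interoperable" buys you in practice: a standards-based client only has to issue the documented requests, and any compliant WMS, ours or anyone else's, will answer them. A minimal discovery sketch (the endpoint below is a placeholder):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder endpoint; any compliant WMS answers the same discovery request.
endpoint = "http://example.com/erdas-apollo/wms"
url = endpoint + "?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetCapabilities"

with urllib.request.urlopen(url, timeout=30) as resp:
    capabilities = ET.parse(resp)

# List every advertised layer name; no vendor SDK involved, just the spec.
for layer in capabilities.iter("Layer"):
    name = layer.find("Name")
    if name is not None:
        print(name.text)
```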
If you really want to see OGC services FLY on TERABYTES worth of heterogeneous data with real-world use cases...the APOLLO Enterprise Suite is what you're looking for.
Tuesday, May 12, 2009
The ESRI Geodatabase Proprietary cluster
I usually don't complain in general, but this time I've had it up to my eyeballs with the inability to work with the ESRI geodatabase without using their proprietary SDKs. I've developed with ArcObjects for over a decade now, so it's not a matter of "complexity"; it's simply an issue of a total lack of interoperability!
The "marketecture" on thier website speaks of interoperability and IT standards yet they don't allow anybody to access the data that they store in their PROPRIETARY storage format...say one thing, do another.
Don't get me wrong, I'm a huge fan of the FEATURES of the geodatabase, but I've had it with having to use ArcObjects to work with what should simply be free flowing GI.
So there is supposed to be a published specification for the "file" geodatabase in the 9.4 release. Great...but what about the DB persisted "enterprise" geodatabase? It must only be "simple" feature specification as all the behavior of objects is in the application tier?? I'm looking forward to implementing the real "simple feature specification" on top of whatever specification they provide....ughhh.
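What I'm asking for is nothing exotic; it's the kind of plain simple-feature access that already works against open formats and spatial databases today. A minimal sketch using the GDAL/OGR Python bindings against a hypothetical PostGIS-style source (the connection string and layer name are placeholders), which is all a published geodatabase specification would need to enable:

```python
from osgeo import ogr  # GDAL/OGR Python bindings

# Hypothetical PostGIS connection and layer name, for illustration only.
ds = ogr.Open("PG:host=dbhost dbname=gis user=gis_user password=secret")
layer = ds.GetLayerByName("parcels")

# Plain simple-feature access: attributes plus geometry, no vendor SDK.
feature = layer.GetNextFeature()
while feature is not None:
    parcel_id = feature.GetField("parcel_id")
    wkt = feature.GetGeometryRef().ExportToWkt()
    print(parcel_id, wkt[:60], "...")
    feature = layer.GetNextFeature()
```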
The "marketecture" should read, "We are totally interoperable...with ourselves only"!!!! (note the very small typeset caveat disclaimer said that under my breath reality check).
The "marketecture" on thier website speaks of interoperability and IT standards yet they don't allow anybody to access the data that they store in their PROPRIETARY storage format...say one thing, do another.
Don't get me wrong, I'm a huge fan of the FEATURES of the geodatabase, but I've had it with having to use ArcObjects to work with what should simply be free flowing GI.
So there is supposed to be a published specification for the "file" geodatabase in the 9.4 release. Great...but what about the DB persisted "enterprise" geodatabase? It must only be "simple" feature specification as all the behavior of objects is in the application tier?? I'm looking forward to implementing the real "simple feature specification" on top of whatever specification they provide....ughhh.
The "marketecture" should read, "We are totally interoperable...with ourselves only"!!!! (note the very small typeset caveat disclaimer said that under my breath reality check).