Have you ever burned your hands by some new and immature technology?


Question


I often hear people saying that you shouldn't rush into adopting new technologies until they have become stable, tried, and tested. There is even a joke about how it takes three versions to get it right. This might be the voice of real-life experience, but at least sometimes such a posture is the result of complacency, resistance to change, and the effort necessary to learn new skills.

In my opinion, however, it is crucial for success in the software industry to keep pace with innovation. While big companies have whole departments dedicated to R&D, in smaller companies it's the development teams that have to keep up. Embarking on a new technology even before it is officially out gives you a head start and helps you keep up with the rest.

Here is the strategy that I try to follow whenever possible:

  • Be aggressive in adopting new technologies
  • Use early betas for experiments and prototypes, and RCs for development
  • Address any last-minute changes to the product when the official release of the technology you adopted early comes out
  • Do not rely on some obscure open-source project with zero activity
  • Be sure to study the official product roadmap, but take it with a grain of salt.

So far, I have never paid the price for being too zealous in jumping on some new technology train, yet I have reaped the benefits. I wonder whether this is just a coincidence, or maybe being an early adopter is not so dangerous after all?

Rather than inviting a discussion on the subject of early adoption, since such an issue is surely contentious and subjective, I would like to hear real-life experiences where adopting a new technology early proved to be a serious mistake and a dire price had to be paid.


Answer 1:


I'm currently in the process of getting burned by Microsoft Office Word 2007's CustomXML support.

CustomXML allows the document to contain custom-defined elements that can model business data. For example, you could define an XSD with your custom elements, associate it with a docx file, then generate the placeholders as CustomXML tags and navigate/modify the documents using C# (or other .NET languages) and the OpenXML SDK. The benefit of OpenXML is that it removes the need to have Office installed on a server machine for automation purposes and is an alternative to purchasing third-party libraries.
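
To give a feel for the kind of processing this enabled, here is a minimal sketch in C# using the OpenXML SDK: it opens a docx, walks the inline CustomXML tags, and fills in one placeholder. The file name and the "CustomerName" element are hypothetical examples, not the actual schema from this project.

    using DocumentFormat.OpenXml.Packaging;
    using DocumentFormat.OpenXml.Wordprocessing;

    class CustomXmlSketch
    {
        static void Main()
        {
            // "report.docx" is a hypothetical template whose placeholders were
            // generated as CustomXML tags (w:customXml elements) from an XSD.
            using (WordprocessingDocument doc =
                   WordprocessingDocument.Open("report.docx", true))
            {
                Body body = doc.MainDocumentPart.Document.Body;

                // Walk the inline CustomXML tags and fill the ones we recognize.
                foreach (CustomXmlRun tag in body.Descendants<CustomXmlRun>())
                {
                    // tag.Element holds the custom element name defined in the XSD.
                    if (tag.Element != null && tag.Element.Value == "CustomerName")
                    {
                        // Replace whatever text the placeholder currently contains.
                        tag.RemoveAllChildren<Run>();
                        tag.AppendChild(new Run(new Text("Contoso Ltd.")));
                    }
                }

                doc.MainDocumentPart.Document.Save();
            }
        }
    }

Code along these lines is exactly what breaks once the Custom XML elements are stripped from a document, as described below.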

In short, there was a lawsuit regarding Word 2007's ability to open documents with custom-defined XML. From this article:

On August 11th, the company received an Office Word sales injunction ...

"This injunction applies only to copies of Microsoft Word 2007 and Microsoft Office 2007 sold in the U.S. on or after the injunction date of January 11, 2010. Copies of these products sold before this date are not affected."

Microsoft's response is to remove support for CustomXML from future versions of Word, and it is releasing a patch that entirely removes this capability. Here is the link to the official update. According to this Microsoft OEM Partner Center site:

The following patch is required for the United States. The patch will work with all Office 2007 languages.

After this patch is installed, Word will no longer read the Custom XML elements contained within DOCX, DOCM, or XML files. These files will continue to open, but any Custom XML elements will be removed. The ability to handle custom XML markup is typically used in association with automated server based processing of Word documents. Custom XML is not typically used by most end users of Word.

I imagine a tiny percentage of end users and developers make use of it, so I consider that last sentence accurate. The problem is that there's currently no word (no pun intended) on how to move forward for projects that did utilize this technology. CustomXML is the cornerstone of a large project I'm currently working on. The impact of this decision is not positive, and it effectively prevents any forward compatibility, as there is no equivalent alternative approach that maintains the structure CustomXML provided.

Some of my coworkers and I have a wealth of knowledge on the topic... I guess it's good we didn't get around to writing blog posts about it as we had planned :) We've accomplished some pretty impressive feats with this and VSTO, but this news is disappointing.

If anyone's interested in this topic here are some articles to check out:

ZDNet articles:

  • Microsoft loses its appeal in $200-million-plus Custom XML patent infringement case
  • Microsoft removes Custom XML features from Office 2007

BNet articles:

  • Microsoft Moves Fast, Already Has Custom XML Patch for Word
  • Microsoft Might Get Advantage or Pain from Order To Not Sell Word

Softpedia articles:

  • Microsoft Can No Longer Sell Office Word 2010, 2007 or 2003
  • Microsoft Dodges Office Word Sales Injunction - At least temporarily
  • New Office 2007 Copies Coming after Custom XML Appeal Was Denied to Microsoft - Starting in 2010

EDIT: added link to the official update.




Answer 2:


I can write a pretty good Java applet. All technologies fall by the wayside eventually, but this one had a very sharp rise and fall.




Answer 3:


Several years ago, we made heavy use of the new SQL Server 2005 feature called Notification Services. To our dismay, it was discontinued in SQL Server 2008. This was a serious problem, and it caused the software architect to question all new Microsoft technologies.

Here's some detail, and some more, and some more

There have also been issues with Microsoft's Entity Framework.




Answer 4:


Anybody else remember OpenDoc, Apple's idea for how all new Mac applications would be written? Didn't think so.




Answer 5:


Scala.

It looks great on paper, so I wrote a project with it while making sure to keep my Scala version up-to-date. The version number (2.7.x) and its years in development made me feel relatively secure doing that.

Well, I made a mistake. The problem? A serious lack of documentation and code samples, as well as an ever-changing class library (twice during my work, previously working code started getting "deprecated" warnings... and I'm talking about a span of a few months and similar version numbers).

I can't say I lost much (this was a private project) but I will not touch Scala in the near future. I still think it's a very nice, promising language, though.




Answer 6:


When I was 10, my father tried to play a New Year song for me on a brand new Elektronika BK-0010-01.

Needless to say, the synthesizer failed to load from the tape, and there was no song until the neighbour came over with a guitar.




Answer 7:


Yes, I have! With JSF 1.0! It seemed like Sun didn't review it well before releasing it.

We kept trying to make things work, but after a while we discovered that our errors were caused by JSF bugs and we had to use workarounds. It was not until JSF 1.1 and the use of the myfaces-tomahawk implementation that the project started picking up some speed.




Answer 8:


QBASIC never really took off. I spent years learning it too.

OK, to be fair, it was my first language and a good way to learn. And it was later replaced by Visual Basic, then VB.NET. So it wasn't a complete waste of my time. ;)

Most of the time even if a language doesn't "take off" exactly, it's still a good learning experience that can be applied to something else.




Answer 9:


Delphi.NET. Still have a tic when I hear that!




Answer 10:


The worst is when you get 80% through a project using a new product and hit a showstopper bug.

Back in the mid-'80s, my boss suggested I try a new dBase alternative called KnowledgeMan. The project was far along when I realized that some crucial bugs I thought were mine were actually theirs. The whole thing had to be redone from scratch; it cost me my job.




Answer 11:


Yes. I'm a Lisp programmer: everything looks new and immature to me. :-)




Answer 12:


AzMan (Microsoft Authorization Manager)

We started using this on a public web site/web app, enticed by dreams of single sign-on and claims of being able to "leverage your existing infrastructure" or whatever the marketing speak now says. A drop-in solution for ASP.NET that sysadmins could manage without having to develop any tools or write any code at all. It was a win-win, right?

We learned several things as a result of our decision, none of which we wanted to learn:

  • Active Directory itself is not a very good choice as an authentication mechanism for a public web site. Not that it isn't capable - it's quite capable, but it's like hiring a Ph.D. to write a "Hello World" app. It's overqualified, it does far more than you could ever need in such a context, it's much more difficult to work with than a plain old SQL table, and it requires a lot more maintenance.

  • AzMan is slow. Very, very slow. The role provider has to maintain a cache, which should tell you just what kind of performance we're talking about. I never did fully understand why it was so slow, but I imagine it has something to do with the hornet's nest of COM and network protocols it depends on.

  • A cache (see above) can be a very dangerous thing when you have little to no control over it. When we added new users manually (i.e. through an administrative application as opposed to the site itself), those users would end up with a "not authorized" screen until the cache expired and they logged out. Sometimes this would even happen to users who self-registered online; we never did find out why.

  • The tools were horrible. Take a brief look at the AzMan console if you don't believe me, or read some of the documentation if you really want a headache. Why should anything be so complicated?

  • It was flaky. A lot of the time the provider would just stop working, spitting out cryptic COM errors (a different one each time!), and we had to restart IIS or even the entire web server to get it to cooperate again. We also had a domain trust set up - because obviously we didn't want 50,000 public user accounts on our internal corporate domain - and the only problem was that administrators had to log in to administrative accounts on the secondary domain to manage roles, because the console would fail in mysterious ways if you tried to use it from the primary (even as an Enterprise Admin with Domain Admin rights on the secondary domain).

  • Support was practically nonexistent. If you use the basic SQL Server role provider (which we don't, but just as an example), there are 10 million tutorials and you can Google for any error message or ask any question on any forum. Whenever something went wrong with AzMan or we wanted to do something new, it was a constant struggle to find good information.

  • Code integration was awkward. You had to go through a bunch of messy COM layers and the interface sucked. If I recall correctly, there was no way to just do a simple authorization check - you had to download the entire app/role registry. This was a long time ago though, so my memory might be foggy on that aspect.

Eventually we couldn't take it any longer and decided to rip out the entire system and replace it with a homegrown one based on a couple of SQL Server tables, which is probably what we should have done from the get-go. Migration was painful (see the two points above), but we got it done and never looked back.
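
For comparison, here is a minimal sketch of the kind of simple authorization check a homegrown replacement allows; the table and column names (Users, UserRoles, UserName, RoleName) and the connection string are hypothetical placeholders, not the actual schema from this project.

    using System.Data.SqlClient;

    static class HomegrownAuthorization
    {
        // Placeholder connection string; the real one would come from configuration.
        const string ConnectionString =
            "Server=.;Database=AppSecurity;Integrated Security=true";

        // Returns true if the user has been granted the given role.
        // Assumes two hypothetical tables: Users(UserId, UserName)
        // and UserRoles(UserId, RoleName).
        public static bool IsUserInRole(string userName, string roleName)
        {
            const string sql =
                @"SELECT COUNT(*)
                  FROM UserRoles ur
                  JOIN Users u ON u.UserId = ur.UserId
                  WHERE u.UserName = @userName AND ur.RoleName = @roleName";

            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@userName", userName);
                command.Parameters.AddWithValue("@roleName", roleName);

                connection.Open();
                return (int)command.ExecuteScalar() > 0;
            }
        }
    }

A single parameterized query like this replaces the COM layers, caches, and domain-trust gymnastics described above.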




Answer 13:


Unfortunately, it cuts both ways. When we first started developing a large web-based app on Windows, .NET had come out in beta, with a final release of .NET 1.0 not far away.

However, because it was new, we didn't know what was going to happen, how popular it would be, or whether MS would drop it six months later. So we stuck with tried-and-tested VB6.

We still have to maintain that VB6 legacy, and it has been restrictive for a while. Although it's not listed anywhere, we're getting paranoid that support for the VB runtime is going to be withdrawn in some future version of Windows.

That said, going the .NET route may have had its own pain: 1.0, 1.1, and 2.0 came out fairly quickly after each other, each with (some) incompatibilities with the previous version. Thus, having to migrate between .NET versions would have carried a different risk. Less or more? I can't answer that one, not having experienced it :-)

In the end, you can be damned if you do and damned if you don't. If someone could read the entrails to determine whether a given technology is going to succeed at any one time, then they shouldn't have a job in software; they should probably go into hedge fund management instead, make loads of cash, and retire early :-)




Answer 14:


Hell yes

I'm currently feeling the pain of being an early adopter of Fortran 2003 :-)

Mark




Answer 15:


Mozilla XULRunner.

It was Adobe AIR before there was AIR. We wrote our Human Resources Management system using it. At the time, XULRunner was "just about" to be released as the underlying engine for Firefox, so we expected that all we would have to do was make sure our users had Firefox installed.

Two years into the project, and right before deployment, a new XULRunner release came out that completely broke all of our code, and a Firefox deployment was nowhere in sight. We ended up deploying on our older version with a dedicated desktop installer and have been using it ever since, without the benefit of security or performance updates, because we would have to rewrite too much code to be compatible. Despite that, it has been a very successful project with our customers.

We're now rewriting the app to run on Ext, which is the new hot thing for us but seems to have more community support, and it offers commercial support if we really get stuck on something.




Answer 16:


Java

I was very eager to start working with it in 1996 and used it for several projects. But for web development I always preferred Perl, and these days PHP. For GUI development I ended up mostly using .NET. For the few command-line programs that cannot be handled by scripting, I prefer to use Perl, Python, or even PHP.

Few of the Java programs I wrote were used over long periods of time, while some of my pre-Java applications are still in use.

I think the main reason for this is that it always took longer to develop something in Java than in another programming language, so the resulting applications contained fewer features and were easier to replace.

As speed of development is usually an issue for my customers, Java tends to end up as the second choice.




Answer 17:


I'm incredibly close to the flame every day as an early MonoTouch adopter. I never know what's going to happen next with this framework. But to its credit, the Novell team is standing by with fire extinguishers just about 24/7 :)




Answer 18:


64-bit Carbon APIs on Mac OS X: I didn't get burned by this personally, but I have a friend at a big software company who spent a year converting almost all of their code to the 64-bit Carbon APIs, only to find out at WWDC that those APIs were no longer going to be made available.




Answer 19:


Has anyone noticed the trend here? The majority of the technologies mentioned were created and then canceled or modified by Microsoft...

I have also been burned by Microsoft, by changes made to the Entity Framework.




Answer 20:


For lacking market presence:

  • Google's Go
    • Poor toolchain, lacks integration with popular compilers and C.
  • Python3000
    • Lacks must-have features: iterators, cleaned-up internals, and tidier interfaces are nice for us hardcore users, but the majority want performance, and that hasn't been delivered.
  • C++0x
  • C99
    • It's been 12 years, and no mainstream compilers fully implement this. Popular projects and niche architectures remain on C89 to be safe.

For poor quality:

  • Windows Vista
    • 'Nuff said.
  • Perforce
  • C++

For lagging behind upstream:

  • PyGTK on Windows
  • MSVC C support

Note that my listing these technologies in no way suggests that they're no good; I'm a huge fan of all of these (except the poor-quality ones). My experience of being burned by these technologies is first-hand (usually from me trying to push them as replacements for existing technology, or simply running into barriers after a significant investment had already been made).




Answer 21:


In my opinion, however, it is crucial for success in the software industry to keep pace with innovation.

This doesn't answer your specific question, but there's a book called Crossing the Chasm that might interest you.




Answer 22:


I was once forced to use Witango, but I'm getting over it.




Answer 23:


For me, Delphi's IntraWeb was it.




Answer 24:


Not programming, but still a new-technology blunder: I nearly lost a nipple to my first mini-ATX build. The moral of that story is never to lean over a case while trying to forcefully close it when it gets jammed...




Answer 25:


I could count many of them. The one that still hurts when I think about it is WLPI (an old BEA workflow product). It never worked out, and the vendor abandoned it. Sigh ...

Anyway, I would say keeping up with the latest (knowing what is there, considering it) is very worthwhile, but only live on the cutting edge if:

  1. You are prepared to get cut and bleed (money/time/resources)
  2. It provides an important strategic advantage/competitiveness.

A good example of this is AJAX. It is now mature enough that every new website should be using it unless there is a compelling reason not to, but when it was first becoming possible, a website built on it would have been very expensive compared to the traditional alternative.

Some websites need the latest look and feel to stay competitive, even to the point where the features of the site themselves are secondary, and they needed to be AJAX early adopters. Others do not. Know which one you are and act accordingly.




Answer 26:


Blackbird.

A wonderful development environment for creating interactive content for MSN.




Answer 27:


True Basic

In the mid-1980s we were looking for a development platform that would work on the various DOS implementations and not be as "bit-twiddling" a language as C was.

We found True Basic, advertised as having been created by the original creators of BASIC back in 1964. Here was a language that 'compiled' down to p-code. Not only would it run on DOS machines, it also ran on GEM (Atari ST) and Amiga boxes.

It had add-ons much like we were used to having with development environments on the VAX/VMS machines we used. Things like Forms packages, an "ISAM" add-on (before the days of callable databases on PCs), etc.

Unfortunately, the multi-platform abilities never sold the language enough. Heck, according to Wikipedia, there's a Mac OS version (though not OS X or Snow Leopard). I even found the 'current' TrueBasic page while writing this note.

Eventually Visual Basic 1.0 came out, and all the BASIC programmers, like myself, checked it out since it had Microsoft's name on it. Now, of course, 10 versions later, we've been steered over to the .NET platform while TrueBasic sits at V5.5.




Answer 28:


VBA - We spent a lot of time integrating it into our product. We still spend a lot of time on each new release making sure that we don't break anything. VB6 and VBA are also COM-based, and that is a problem if you want to run as a standard user without write access to the registry.




Answer 29:


This addresses your discussion more than your question. I think you are assuming that the cost-benefit of adopting new technologies is a given. For a very large corporation, changing technologies can cost hundreds of millions of dollars. If the cost-benefit is not there, then those hundreds of millions can be saved. Most companies use technology to make something else and cannot afford to consume new technology simply because it exists. When the cost-benefit is there, then it makes sense to do so.




Answer 30:


The TurboGears web framework

I had a web app to write and jumped onto this (having heard about it from a friend). I wasn't really aware of the alternatives, didn't know MVC properly, and wasn't aware of the alternatives to the various 'standard' components (e.g. SQLAlchemy instead of SQLObject). While the documentation and general state of the project are far better than they were when I got my hands dirty, I ended up with a huge application that relied on 'tricks' to bypass some of the magic features and had lots of undocumented features in it to meet the deadlines. It became a maintenance nightmare, and I really wish I had taken the time to build something simpler, with plans for a rewrite if the requirements changed.

This was the 1.x series, which has now been deprecated in favor of the Pylons-based 2.x series. As you can imagine, the core team itself decided on a re-architecture, but I was stuck with a legacy application that I had to maintain.



Source: https://stackoverflow.com/questions/904238/have-you-ever-burned-your-hands-by-some-new-and-immature-technology
