 Originally Posted by LCGuy
1. I have beta tested software for many years - I have seen and witnessed, first hand, alpha builds that look to be "complete" and perform fairly well, with TONS of bloat and debugging code still in them (because, in "alpha" mode, it's a build on the old version with the beginnings of new features being added piecemeal, to see if adding a specific feature causes issues with certain systems - much better than enabling multiple/all new features at once and having to debug backwards), not to mention inefficiencies galore. They perform well on some tasks, but not well at all on others, and usually the UI is where those inefficiencies are most apparent. Those videos smack of it. Just my hunch, but I'm entitled to it, and I'm fairly confident I'm right.
How many years? 30? That's how long I've been doing it. I do agree that badly designed/coded/integrated/tested software runs like hell and is certainly not something to be demonstrated before a worldwide audience. I do not think a corporation like HP would really be so cavalier or amateurish as to do such a thing (although the old Palm might be). By the way, code with "TONS of bloat and debugging code in it" is pre-alpha software, not to be seen by anyone but the development team - at least in the professional software industry.
 Originally Posted by LCGuy
2. I don't appreciate the "fortune cookie proverb" comment. Rather insulting, actually. Next.
Go re-read your #2 then. Perhaps non sequitur might be less insulting to you. In any case, it was about as informative as telling me that standing in the rain clothed will get my clothes wet. I apologize if you feel as insulted as my intelligence did after reading that statement.
 Originally Posted by LCGuy
3. Not sure I understand the "glittering personalities" innuendo, either... (maybe I'm just too stupid to know when I'm being insulted a second time in as many sentences, I dunno). Next.
Well, "glittering generalities" describes a statement so devoid of information that it can mean pretty much anything one wants it to mean. In any case, your #3 did nothing to further your case.
 Originally Posted by LCGuy
4. Wrong conclusion. I can tell you, factually, that inefficient code can decrease the performance of even the most powerful, efficient processors. Seen it, witnessed it, and will likely see it again, many, many times, in my future testing capacities. (Hint: it only takes one innocently enabled infinite loop running in the background.)
I didn't realize that we were including poorly developed software as the baseline for your CPU-loading example. Yes, it's true - even cleanly written, efficiently formed, and professionally tested infinite loops can waste CPU cycles quite well. So what did that have to do with the TP's choppy performance again? I don't believe "inefficient code" should be used in a product's debut demonstration, nor do I believe "innocently enabled infinite loops" should (or did) exist in a demonstration as important as HP's 9 Feb event - not if the software development team is the least bit professional. Pre-release code may benefit from some cleanup and can always be improved, but if it's still "inefficient," it's not ready for a demo outside of the producing company.
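For anyone who hasn't watched it happen, here's a toy Java sketch of the kind of "innocently enabled" loop being described - purely illustrative, not anything pulled from webOS or the TP - where one leftover debug thread busy-waits on a flag instead of sleeping and pins a core at roughly 100% while the app looks idle:

// Hypothetical example: a leftover debug poller that spins on a flag
// instead of sleeping or blocking. While it runs, one core is fully
// consumed even though the program appears to be doing nothing.
public class BusyWaitDemo {
    private static volatile boolean done = false;

    public static void main(String[] args) throws InterruptedException {
        Thread poller = new Thread(() -> {
            while (!done) {
                // Busy-wait: no sleep, no blocking call, so the scheduler
                // hands this thread every cycle it asks for.
            }
        });
        poller.start();

        Thread.sleep(5000); // watch a CPU meter peg one core for ~5 seconds
        done = true;        // the one-line fix: stop (or sleep inside) the loop
        poller.join();
    }
}

Run it and watch a CPU meter for those five seconds. The fix is a one-liner (sleep or block instead of spin), which is exactly why this sort of thing should be caught long before a public demo.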
 Originally Posted by LCGuy
5. You aren't understanding my point: true multitasking will suck up CPU time and system resources. Just look at your PC - open too many windows (and, of course, "too many" is relative to your system) and it will slow to a crawl, and THAT is real multitasking.
An inefficiently designed and implemented true multitasking OS running poorly formed software will certainly slow to a crawl (think Windows before 7). Well-designed and implemented OSes are a bit smarter about how their resources are shared among hungry, CPU-demanding processes, so they don't "slow to a crawl" unless something is broken or a program is behaving badly (although they should handle that as well). This sort of negative result just shouldn't be a factor in systems where the hardware, OS, and software are all owned and operated by the same company - as in HP and its demo of the TP.
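To make "smarter about how resources are shared" a little more concrete, here's a rough Java sketch of one such principle - bounded concurrency. It's purely illustrative (not how webOS, Windows, or any particular OS actually schedules work); the idea is simply that demanding background work gets a fixed budget and queues up rather than thrashing the whole machine:

// Illustrative only: cap background work at the number of cores so
// extra tasks wait in a queue instead of fighting for the CPU.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class BoundedWorkDemo {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService background = Executors.newFixedThreadPool(cores);

        for (int i = 0; i < 100; i++) {
            final int task = i;
            background.submit(() -> {
                // Simulated CPU-heavy work
                long sum = 0;
                for (long j = 0; j < 50_000_000L; j++) sum += j;
                System.out.println("task " + task + " done: " + sum);
            });
        }
        background.shutdown(); // queued tasks still finish, but never more than 'cores' at once
    }
}

The thread pool itself isn't the point - the point is that someone decided up front how much of the machine the hungry work is allowed to consume, instead of letting every process fight until the UI crawls.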
 Originally Posted by LCGuy
6. (This is your last comment.) I'm not sure I get your point - so let me say this: it's ALL about what the consumer perceives and can do with the device. Most are not techno-geeks like many of us here are - they just want their devices to do what they want them to do, and when they don't, it gets the broad-stroke paint of "it doesn't work". The iPhone doesn't truly multitask non-native apps, but it DOES do what iPhone users apparently need, which is why it's never criticized by the general public for that - but remember "copy and paste" on the iPhone, and the public complaints about not having it?
My point was that WebOS devices will be able to truly multitask (not with cards as stagnant placeholders, but with open applications either in a suspended state or running/performing tasks), and THAT is a compliment to the design and capabilities of the system. But with this additional power comes the possibility of overuse/abuse by the user, who won't understand its limitations, will open a gazillion cards at once, and will see degraded performance. The device will then, unfairly and incorrectly, be judged by said user as "slow", "laggy", or "defective". So, aside from the additional CPU horsepower they are providing, it would also make sense for HP to allow preferences to be set by the users to help them decide just how much of a performance hit they are willing to take in the name of true "multitasking".
You're right - with your clarification here I better understand what you meant at the end of your previous post. This is a situation that a well-built OS can prevent. Android, since Froyo's release (2.2), does this very well: as soon as it detects demand that exceeds the resources currently available, it smoothly and silently frees them up by parking or shutting down other active processes. That's how it's supposed to work, so that casual users and advanced users alike enjoy the same multitasking experience.
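For what it's worth, here's the app-side half of that contract as I understand it - a bare-bones, hypothetical Android activity (the class and key names are made up; the lifecycle callbacks are real) that saves its state so the OS is free to park or kill its process under pressure and bring it back later as if nothing happened:

import android.app.Activity;
import android.os.Bundle;

// Hypothetical activity for illustration only.
public class NoteActivity extends Activity {
    private static final String KEY_DRAFT = "draft";
    private String draftText = "";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (savedInstanceState != null) {
            // We come back through here after the OS quietly reclaimed
            // our process to free resources for something else.
            String saved = savedInstanceState.getString(KEY_DRAFT);
            draftText = (saved != null) ? saved : "";
        }
    }

    @Override
    protected void onSaveInstanceState(Bundle outState) {
        super.onSaveInstanceState(outState);
        // Called before this activity becomes a candidate for reclamation,
        // so the user never notices it was parked or shut down.
        outState.putString(KEY_DRAFT, draftText);
    }

    @Override
    public void onLowMemory() {
        super.onLowMemory();
        // System-wide memory pressure: drop anything we can rebuild
        // (caches, bitmaps) so the OS doesn't have to kill processes.
    }
}

Because the app can always be restored from that saved state, the OS never has to ask the user's permission to reclaim it - which is exactly what keeps the "gazillion cards" scenario from dragging the whole device down.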