Is my thermostat a computer?  :-)

No, it is a simple feedback device, unless it is a programmable thermostat.

Suppose it's programmable.

Then yes, it is a computer. I think a more natural way of speaking would be to say that it has a computer in it.

Well, no; it has an embedded chip...that doesn't make it a computer, does it?

We have to start with a definition. To most people, 'computer' means a personal computer, and even when they think about a supercomputer they picture a more powerful PC. If, however, we stick to the definition 'a device that processes data', then 'computer' has a much broader meaning. ENIAC was a computer, but it did not resemble present-day computers. A computer computes data, therefore any device that does so can be called a computer. A programmable thermostat has a small computer inside, and one of the more sophisticated thermostats might be more powerful than ENIAC.

I think a strong connotation of 'computer' nowadays is that it is universal (i.e., it can perform any computational task). A thermostat can be incredibly sophisticated, but it will still only tell you when to turn on the heater. A Pac-Man machine will only play Pac-Man. But a computer can do either of those things, or much more, so long as you give it instructions on how to do so.
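That distinction can be sketched in a few lines of Python (a toy contrast with purely illustrative names, not a real thermostat or machine design):

```python
# A fixed-function device: a thermostat's entire behaviour is this one rule.
def thermostat(temperature, setpoint=20.0):
    """Return True when the heater should run (a simple feedback rule)."""
    return temperature < setpoint

# A universal device: a tiny interpreter whose behaviour depends entirely on
# the instructions it is given. The same machine runs any program.
def run(program):
    """Execute a list of ('push', n) / ('add',) / ('mul',) instructions."""
    stack = []
    for op, *args in program:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[-1]

# The thermostat only ever answers "heater on?"; the interpreter computes
# whatever its program describes.
print(thermostat(18.5))                           # True
print(run([("push", 2), ("push", 3), ("add",)]))  # 5
print(run([("push", 2), ("push", 3), ("mul",)]))  # 6
```

However elaborate the thermostat's rule becomes, it stays one fixed function; the interpreter's repertoire grows with every new program you hand it.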

A computer used to mean a person who computed, eventually a person who computed using an adding machine. Many of these computers were women. The computations were often systems of differential equations (or other linear systems), for example, solving problems in ballistics.

I intend to give this page a serious working over as the result of some interesting discussions on talk:Konrad Zuse and on the other "history of computing" related pages. Robert Merkel

As the author of the page (though it has been improved somewhat since), I think a complete rewrite would be nice. I wrote it mostly in desperation that so important a topic had only a one-line entry. The current page is better than that, but not particularly good.

However, I suggest not deleting anything from the page until you have a complete article that covers all the important stuff already there (and hopefully more!). One way might be to rule a line at the top (or bottom) and start your rewrite in a separate section. When you have enough there, the old version could be removed.

I have seen a few other pages where mediocre articles were deleted by someone who then ran out of steam before completing their rewrite, leaving something worse than the original. Leaving both versions available during the transition protects somewhat against this disaster. Best of luck here!

I didn't see the above comment until I had committed my rewrite (it was actually a good idea you had, if somebody can restore the old article and hang it somewhere that'd be good).

(Done. It's at the end of the new one. New one is looking good!)

It is approximately half "feature-complete" at this point. Seeing we already have a great deal of other material on computing topics, I intend to concentrate merely on the "what is a computer" question, with very brief overviews of the other two subheadings.

Any suggestions (or just plain edits) on how to improve my explanation of why Turing-completeness is important would be appreciated. Robert Merkel

On the commercial computing side, data processing machines began to enter serious use circa 1895 in the USA, and during the early 20th century in many other places. These machines were usually based on punched cards and could do operations such as sorting, tabulating, counting, and in some cases calculating a total, but were probably not general enough to be considered computers. When computers became cheap enough to be competitive, they took over, because they can do all this and have much more flexibility. Many of the technologies used in computers from 1945 to 1970 were originally developed for earlier data processing machines, and some of the same companies were involved (IBM and Burroughs, maybe Sperry, probably others in other countries). In the history section this seems somehow relevant, but you write so much better than I do, so I leave it to you to decide if, or how, to add it.

Yes, the new one is really looking good! --LMS

I think the old version should be moved here. Also, even though the main article could be expanded almost without limit, it might be good to move or remove all the metacomments so that it will read like most of the other articles. David 19:49 Sep 21, 2002 (UTC)

"It's now commonplace for operating systems to include web browsers, text editors, e-mail programs, network interfaces, movie-players and other programs that were once quite exotic special-order programs." Is this really true, even in Windows? In Linux, vim or emacs, xine or mplayer, kmail, evolution, mutt or pine, konqueror, galeon, opera, mozilla or firebird would never be considered part of the OS, which most people would consider the kernel; likewise for OS X and its BSD-derived kernel. Do people really regard Notepad and IE and Outlook and Windows Media Player as part of the OS? I know Microsoft claims it for IE, but the rest? I'm sure Microsoft would like to claim almost any application where they have competition is "part of the OS" so they can happily bundle it with the OS, but bundling with and being part of are two different things. This passage should be defended or removed: is there any example of an OS where, e.g., the text editor is part of the OS? Generally, where the functionality can be provided by an alternative program, it cannot be considered part of the OS. Network interfaces are the one example of a function which has genuinely been taken over by the OS as compared to a third-party program (cf. Trumpet Winsock). Htaccess

I really like this article but feel it is missing an important piece about the use of computers for codebreaking in WW2. Is that somewhere else, and if so can we put in a link to it? --(talk)BozMo 13:37, 14 May 2004 (UTC)

Well, that's what most of the work in building an encyclopedia consists of, isn't it? :-) As a start, see perhaps: Ultra. Kim Bruning 14:09, 14 May 2004 (UTC)

Oh dear. For some reason John von Neumann isn't mentioned in the definition section. Odd that. Other missing people are Alan Turing and the pair Alonzo Church and Stephen Kleene.

These folks provide 3 different detailed definitions of computer, all of which are currently in actual use in the field. :-)

The 3 POVs (each with a very short summary, so YMMV) are:

  • von Neumann: a computer is a stored-program machine built from a memory, a processing unit, and input/output (the von Neumann architecture).
  • Turing: a computer is a practical approximation of a universal Turing machine.
  • Church/Kleene: a computer is a device for evaluating expressions in the lambda calculus.

These 3 definitions overlap:

  • A von Neumann architecture can simulate a Turing machine, with the understanding that most implementations can't actually simulate an infinite-length tape, just a very long one.
  • Originally people thought that von Neumann machines weren't particularly suited to performing lambda calculus. Over time people have gotten practice; nowadays, a language like OCaml might actually run faster than a language (like C) that was specifically designed for von Neumann architectures.
  • A Turing machine can be simulated in lambda calculus, and lambda calculus can be performed by a Turing machine.

This means that all 3 POVs are logically equivalent, but they do each bring a slightly different way of looking at computers to the table.
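One half of that equivalence is easy to demonstrate concretely: lambda calculus can be evaluated on an ordinary von Neumann machine. A minimal sketch in Python using Church numerals (illustrative only, not an efficient encoding):

```python
# Church numerals: numbers represented purely as lambda expressions.
# A numeral n is a function that applies f to x exactly n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Convert a Church numeral back to a Python int by counting applications."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```

Arithmetic here happens entirely by function application, yet it runs happily on stored-program hardware; the simulation in the other direction (a Turing machine reducing lambda terms) is the harder half.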

I guess I'd better look into this better sometime. Kim Bruning 23:06, 28 Jul 2004 (UTC)

Partially done, and extended this comment as a result. Kim Bruning 07:40, 29 Jul 2004 (UTC)

I just completely rewrote the definition section. Turing and von Neumann now get mentions but Church does not (maybe you could add him in a suitable place). Given the general title, I tried to stay away from too much theory and leave the details for other pages.

I think the following would be useful:

  • A schematic diagram of a simple von Neumann architecture to illustrate the "How a computer works" section.
  • More information in the computer applications section
  • Also a sprinkling of more up-to-date references in various sections.

I also moved a lot of etymology to wiktionary:computer, which seems a much better place for it.

John Harris

What do I want from a computer? Well, love, but I've pretty much given up hope on that one. Ten years ago a guy named Steve made a computer called NeXT, and things still suck. This is intolerable. Why don't I have a minimal instruction set computing (MISC)-type box on my desk? Something like a Forth engine (cf. Chuck Moore's F21 CPU and similar), with Lisp as the app language (coded itself in machine Forth, of course). Forth can be implemented surprisingly efficiently with a stack machine. No register mess, which makes context switching basically painless and trivial. Easy, cheap, frequent procedure calling is encouraged, with implicit low coupling and modularity thanks to the ever-present stack. Programs are small and efficient because so few instructions are required on a complete stack machine. Oh, yes, while I'm at it, I'd like the whole thing implemented in asynchronous logic CMOS please, so I can implant it in my cerebral cortex and run it off body heat and the occasional shot of glucose. Sigh. Well, I'm a cs geek, but there are times I wish I was in comp. engg...
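The appeal of a stack machine is easy to sketch; here is a toy Forth-style interpreter in Python (a hypothetical sketch of the idea, not how the F21 or any real Forth engine works):

```python
# A minimal Forth-style stack machine: no registers, just one data stack.
# Each word pops its operands and pushes its result, so words compose
# freely with no calling-convention bookkeeping at all.
def forth(words):
    """Interpret a whitespace-separated program; return the final stack."""
    stack = []
    ops = {
        "+": lambda: stack.append(stack.pop() + stack.pop()),
        "*": lambda: stack.append(stack.pop() * stack.pop()),
        "dup": lambda: stack.append(stack[-1]),
        "swap": lambda: stack.extend([stack.pop(), stack.pop()]),
    }
    for word in words.split():
        if word in ops:
            ops[word]()       # a known word: execute it against the stack
        else:
            stack.append(int(word))  # anything else: a literal to push
    return stack

print(forth("3 dup * 4 dup * +"))  # [25]  (3*3 + 4*4)
```

Because every word's inputs and outputs live on the implicit stack, "procedure calling" is just juxtaposition, which is exactly the low-coupling property the comment above is after.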

The picture below shows one of my MBs that fried mysteriously; it was connected to a high-end UPS which showed no power issues at all. But obviously something went wrong. Anyway, thought the pic might be of use to someone. Belizian 20:26, 6 Oct 2004 (UTC)


Fried motherboard, and it didn't come from outside; something with the board itself caused this.