terabyte memory

Earlier today I mistyped a letter, writing terabyte memory in place of hard drive size. Many people are now laughing and saying that terabyte computer memory doesn’t exist. Can you Linux engineers and users help out? Does such a system exist? These people are Apple users.
Thanks from a happy SUSE user.

You mean as a single stick? I don’t think there is a market for a 1TB RAM stick. However, a cluster of computers could easily contain that much RAM; see this blog entry:

Ever Wondered What A Terabyte of RAM Looked Like?

Obviously not your home computer.
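
For a sense of scale, the aggregate isn’t exotic once you spread it over nodes. Purely illustrative numbers follow, not any particular cluster:

```python
# Purely illustrative: how many commodity nodes reach 1 TB of RAM in aggregate.
TARGET_TB = 1
GB_PER_NODE = 16            # a fairly ordinary server around 2010 (assumed figure)

nodes_needed = (TARGET_TB * 1024) // GB_PER_NODE
print(f"{nodes_needed} nodes x {GB_PER_NODE} GB = {TARGET_TB} TB of RAM")
```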

just depends on need and money available:

this server, for example, has “up to 2 terabytes of system RAM per
system”: http://www.acclinet.com/ibm-servers/ibm-780-server.asp

there are many other examples…background:

http://itcommunity.intel.co.uk/community/uk/blog/2009/08/17/1-terabyte-of-ram-in-a-4-socket-server-anyone;jsessionid=135D3AAB8FAFC5F07EA3E5CFA5DBA8A6.node5ITG
http://www.cyberciti.biz/tips/linux-ramback-patch.html
http://www.google.com/search?q=Terabyte+of+RAM

and, i’d guess that several governments of earth have operated
multi-terabyte-RAM computer clusters for many many years…


DenverD (Linux Counter 282315)
CAVEAT: http://is.gd/bpoMD
posted via NNTP w/TBird 2.0.0.23 | KDE 3.5.7 | openSUSE 10.3
2.6.22.19-0.4-default SMP i686
AMD Athlon 1 GB RAM | GeForce FX 5500 | ASRock K8Upgrade-760GX |
CMedia 9761 AC’97 Audio

Many modern computers have > 1 terabyte of memory. The computer I am typing on now has 1.5 terabytes.

I have additional offline storage of another 2.5 terabytes (in multiple external hard drives).

At the office where I work, they have petabytes of storage (across many dozens of devices) for satellite and product imagery.

Of course if you mistakenly noted you had 250 terabytes of storage on a PC, I could see that would cause some laughter.

To put this in perspective, read this: Exabyte - Wikipedia, the free encyclopedia

According to CSIRO, in the next decade, astronomers expect to be processing 10 petabytes of data every hour from the Square Kilometre Array telescope.[7] The array is thus expected to generate approximately one exabyte every four days of operation.
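
That quoted figure is easy to sanity-check with a bit of back-of-the-envelope arithmetic (a minimal sketch; decimal prefixes assumed, and the rate itself is of course an estimate):

```python
# Rough check of the SKA figure quoted above: 10 PB per hour for four days.
PB_PER_HOUR = 10
HOURS_PER_DAY = 24
DAYS = 4

total_pb = PB_PER_HOUR * HOURS_PER_DAY * DAYS   # 960 PB
total_eb = total_pb / 1000                       # ~0.96 EB (decimal units)

print(f"{total_pb} PB over {DAYS} days ≈ {total_eb:.2f} EB")
```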

and there are more examples here: Petabyte - Wikipedia, the free encyclopedia

Film: The 2009 movie Avatar is reported to have taken over 1 petabyte of local storage at Weta Digital for the rendering of the 3D CGI effects

Kingston Memory jokingly said last April 1st that they had the upper hand on making the first 250GB memory sticks, which would make a terabyte PC possible with only 4 sticks, just as soon as the motherboard and processor makers get on board.

When asked if these new chips would be cost-effective, they said “Oh, about $13/GB, or a mere $13,000”. The anticipated size of each 250GB block would be about a 10" spiral, so these desktops could be as small as, well, a small sofa.
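
For what it’s worth, the joke’s own numbers roughly add up (just a throwaway check of the figures quoted above):

```python
# Back-of-envelope check of the (April Fools) pricing quoted above.
price_per_gb = 13          # USD per GB, as quoted
sticks = 4
gb_per_stick = 250

total_gb = sticks * gb_per_stick        # 1000 GB, i.e. roughly 1 TB
total_cost = total_gb * price_per_gb    # 13,000 USD

print(f"{total_gb} GB at ${price_per_gb}/GB ≈ ${total_cost:,}")
```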

Of course this was their April Fools’ joke, but they did say in their defense that they are working ever harder at higher integration, so maybe in a few years it might be possible.

techwiz03 wrote:
> maybe in a few years it might be possible.

already possible, just not yet for the personal computer market:

“The density of information of this memory is 1GB/square millimeter
having the thickness of the recording media (array) of 10 microns.
60GB of non-volatile RAM are contained in one cubic millimeter. Also,
this Quantum-Optical technology allows to build up to 256GB in one
package / or module up to 35TB.”

cite: http://atomchip.com/_wsn/page3.html

see photo: http://atomchip.com/_wsn/page2.html


DenverD (Linux Counter 282315)
CAVEAT: http://is.gd/bpoMD
posted via NNTP w/TBird 2.0.0.23 | KDE 3.5.7 | openSUSE 10.3
2.6.22.19-0.4-default SMP i686
AMD Athlon 1 GB RAM | GeForce FX 5500 | ASRock K8Upgrade-760GX |
CMedia 9761 AC’97 Audio

Kingston was referring to standard monolithic memory compatible with standard computers of the micro, mini, and mainframe class.

What you are referring to is quantum memory, or more properly photonics sourced through laser-light technologies. We are many years away from that kind of breakthrough, since they (University of Oxford) are still trying, as of March 2010, to achieve the first quantum photonic processor capable of interfacing with nanomicron quantum-optic chips. To date, they can flash-store up to 256GB per chip and reverse the process to flash-read the same 256GB, but to be useful they need better lasers that can be controlled quantumly to bias the data stream so as to access only some random part of the information. But it does show future promise.

You have 1.5 terabytes of memory in your desktop computer? :sarcastic:

Please provide a link where I can buy the motherboard you’re using, and a link to the memory modules.

I’d kill for such a system on my desktop.

Samsung is just now releasing their 32GB memory modules for servers, which provide for up to 192GB while demanding the same power as twelve 4GB modules, and Intel, ASUS, and ABIT have bank-switched motherboard prototypes coming out that are designed to use 32GB of main memory plus 15 selectable banks of 32GB each, giving 64GB addressable at one time and overall memory of 512GB.
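
Taking those figures at face value, the totals work out like this (arithmetic on the numbers above only, not a claim about any shipping board):

```python
# Arithmetic on the bank-switched configuration described above.
main_gb = 32        # main memory always mapped in
banks = 15          # selectable banks
bank_gb = 32        # size of each bank

addressable_at_once = main_gb + bank_gb          # 64 GB visible at any moment
total_installed = main_gb + banks * bank_gb      # 512 GB in total

print(f"Visible at once: {addressable_at_once} GB, installed: {total_installed} GB")
```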

This is similar to what I did back in 1975. The DGS Z-80 machine at that time was limited to 64KB, which was constructed as 8 cards of 8 banks of 8 × 2102 (1K x 1-bit) memories. We needed more memory, since hard disks weren’t readily available, so I constructed bank switchers to control 7 boards at a time for a total density of 7 x 8K x 256 + the original 8K, which was over 14MB, addressed through 1 PPI and the standard 64K memory bus. It had great potential back at that time, but was so expensive we never got above 128KB. 3 years later the PC came out with 640KB of memory for about the same money as we spent on the memory alone.
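
For anyone who never had to deal with it, bank switching just trades a latch write for a larger reachable address space. Here is a purely illustrative model (hypothetical sizes, not the actual DGS hardware) of how a logical address maps through a bank register:

```python
# Illustrative model of bank-switched addressing: a small CPU address window
# is backed by one of several physical banks, selected by a latch/PPI write.
WINDOW_SIZE = 8 * 1024      # 8K window visible to the CPU (hypothetical)
NUM_BANKS = 256             # banks selectable through an 8-bit latch

physical_ram = bytearray(WINDOW_SIZE * NUM_BANKS)   # 2 MB of "physical" memory
bank_register = 0                                    # what the PPI write would set

def select_bank(n):
    """Write the bank number to the latch (here, just a variable)."""
    global bank_register
    bank_register = n % NUM_BANKS

def read(addr):
    """Read a byte at a CPU address inside the banked window."""
    return physical_ram[bank_register * WINDOW_SIZE + (addr % WINDOW_SIZE)]

def write(addr, value):
    physical_ram[bank_register * WINDOW_SIZE + (addr % WINDOW_SIZE)] = value

# The CPU only ever sees an 8K window, but far more memory is reachable:
select_bank(3)
write(0x0100, 0xAA)
select_bank(4)
write(0x0100, 0xBB)
select_bank(3)
assert read(0x0100) == 0xAA   # bank 3 kept its own copy
```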

I think the OP and oldcpu are referring to 1TB/1.5TB HDDs, since modern OSes lump swap + RAM + HDD all under the term memory.
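
On a Linux box you can see how closely RAM and swap already sit together: the kernel reports both in /proc/meminfo. A quick sketch (Linux only):

```python
# Print physical RAM and swap as the Linux kernel reports them.
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.strip().split()[0])   # values are in kB
    return info

m = meminfo()
print(f"RAM:  {m['MemTotal'] / 1024 / 1024:.1f} GB")
print(f"Swap: {m['SwapTotal'] / 1024 / 1024:.1f} GB")
```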

I’d kill for 1TB of main RAM too, but I guess I’ll have to wait for a 32TB quantum RAM and CPU to drop below the $19 million price tag.

Some of us come from a generation where “memory” was used for total RAM + storage :wink:

What generation is that?

I’ve been building computers since 1985, and I don’t remember when hard drives were considered “memory”.

That said, I was just poking a little fun at oldcpu, no harm intended.

But I still want 1.5 terabytes of memory lol!

I have almost 50 years of memory!! ;). When I was trained as a UNIX admin, the trainers insisted that we regard anything that stores (temporarily or long term) as memory. Memory is divided into volatile and non-volatile. Hard disks, tapes, CDs, etc. belong to the second category. Just think of what HDDs do, they ‘remember’, don’t they?

yep, got more than 50 years of memory but can’t figure out how to replace the failing ones behind my eyes :open_mouth:

And yes, I remember instructors saying “to make life easy we will call all storage forms memory in general, but add the distinction that we have non-volatile, volatile, and Alzheimer’d to cover memory of either type with bad blocks or banks”.

On Wed, 2010-05-12 at 01:36 +0000, oilpaint wrote:
> Earlier today I mistyped a letter, writing terabyte memory in place of hard
> drive size. Many people are now laughing and saying that terabyte computer
> memory doesn’t exist. Can you Linux engineers and users help out? Does
> such a system exist? These people are Apple users.
> Thanks from a happy SUSE user.

Well… I’m doubtful. Running that amount of memory requires a lot of
$$$$$$$'s and power and real estate.

So… I’ll say “no”… not practically anyhow.

Ok… googling I got:
http://www.tpc.org/results/individual_results/IBM/IBM_595_32_20050412_es.pdf

But still… this kind of config would be VERY rare in the “real” world.

techwiz03 wrote:
> yep got more than 50 years of memory

youngster.


DenverD (Linux Counter 282315)
CAVEAT: http://is.gd/bpoMD
posted via NNTP w/TBird 2.0.0.23 | KDE 3.5.7 | openSUSE 10.3
2.6.22.19-0.4-default SMP i686
AMD Athlon 1 GB RAM | GeForce FX 5500 | ASRock K8Upgrade-760GX |
CMedia 9761 AC’97 Audio

Well, yeah, and there used to be paper tape and punched cards too, but let’s just be practical lol!

I punch holes in toast, eat it and wait till I remember things I did not know before :wink:

I did some reading last night: it seems nobody really knows our own memory capacity. I found figures ranging from 2 TB through 1000 TB to “almost infinite”. I like to think mine is the latter… :wink:

In an article I was introduced to in school, “The human pneumatic storage - a study of infinite biochemical computer memory”, published back in the early 1970s, it was conjectured that if a computer could achieve real growing brain cells as a form of memory, current (remember, this was decades ago) computers could see about 100GB of storage in a very young brain, and if that brain matured and followed human evolution it would peak at about 25 years at over 2.8 terabytes for a male brain. The female brain would peak at 22 years at about 2.1 terabytes. They alluded that females mature faster than males, but males tend to grasp and retain more detailed information.

Not too sure now whether the title term was pneumatic or neromatic.

But at any rate, if we do achieve such a phenomenal storage method, how would we keep our computers from drinking? That burns brain matter faster than a streaker eluding police. rotfl!