RAM is used to load applications and files into fast, non-persistent storage so that the processor can rapidly access the data when it needs it. Adding RAM is also considered one of the fastest and cheapest ways to increase the performance of your computer system.
Before we talk about how to address memory, we need to have a better understanding of what memory is and what it does for us inside of our computer system. When we talked about processors, we talked about the fact that the processor does all the computations, but it has to have a place to store data and instructions before it does those computations.
The processor itself contains a very high-speed memory known as a cache. This cache is a very small amount of space, but it is extremely fast. Now when you run out of cache space, the next type of memory you have in your system is random access memory, or system memory. This memory is still fast, but not nearly as fast as that cache memory.
Now what ends up happening is the processor uses the cache memory first, and as it consumes the data from that cache, new information moves from the system memory into the cache. This keeps going as a pipeline process: data moves from the system memory to the cache, and is then processed and executed by the CPU.
In addition to the system memory, we also have something known as storage. And storage is things like hard drives, USB drives, CD-ROMs, DVDs and things like that. These mass storage devices can hold a lot more data than memory can, but they are much slower.
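To put rough numbers on this hierarchy, here is a small sketch. The access times below are hypothetical, order-of-magnitude ballpark figures for illustration, not measurements from any specific machine:

```python
# Hypothetical, order-of-magnitude access times for each tier of the
# memory hierarchy (illustrative ballpark figures only).
hierarchy = [
    ("CPU cache",       1e-9, "tiny, fastest"),
    ("System RAM",      1e-7, "larger, still fast"),
    ("Hard disk (HDD)", 1e-2, "huge, slowest (mechanical)"),
]

cache_time = hierarchy[0][1]
for name, seconds, note in hierarchy:
    slowdown = seconds / cache_time
    print(f"{name:16} ~{seconds:.0e} s  (~{slowdown:,.0f}x cache)  {note}")
```

Each step down the hierarchy trades speed for capacity, which is exactly why data flows from disk to RAM to cache before the CPU works on it.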
So if I want to load up a new PowerPoint on my system, it's usually stored on my hard disk, which is a mass storage device. When I tell my computer I want to read that, the processor is going to send a signal over to the hard drive and say, "Where is that data?" That data then moves from the hard drive to the system memory, and from the system memory small pieces of that data can be brought into the cache and then worked on by the processor. This is how this all works together inside your system.
So as we're working with different files, we're going to place them temporarily into RAM; that way we can work with them in a much faster manner. And then when we're done with them, they'll get saved back to the hard drive, which is that permanent storage device.
You have much, much less RAM available than you do hard disk space. So what the memory does is act as a disk cache. This way we can pull files from the disk into memory, work on them, and then place them back onto the mass storage device when we're done. This lets the RAM act as a faster temporary storage area for the recently used and commonly used pieces of data from that hard disk, giving us quicker operations.
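The disk-cache idea above can be sketched in a few lines of Python. This is a simplified illustration, not how an operating system actually implements its page cache: `disk_read` is a hypothetical stand-in for a slow mass-storage read, and the cache evicts the least-recently-used block when RAM runs out of room:

```python
from collections import OrderedDict

def disk_read(block_id):
    # Hypothetical placeholder for a slow mass-storage access.
    return f"data-for-block-{block_id}"

class DiskCache:
    def __init__(self, capacity):
        self.capacity = capacity      # RAM is much smaller than the disk
        self.cache = OrderedDict()    # tracks least-recently-used order

    def read(self, block_id):
        if block_id in self.cache:            # cache hit: fast RAM access
            self.cache.move_to_end(block_id)  # mark as recently used
            return self.cache[block_id]
        data = disk_read(block_id)            # cache miss: slow disk access
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)    # evict least-recently-used block
        return data

cache = DiskCache(capacity=2)
cache.read(1)   # miss: fetched from "disk" into RAM
cache.read(1)   # hit: served straight from RAM
cache.read(2)   # miss
cache.read(3)   # miss: the cache is full, so block 1 is evicted
```

The second read of block 1 never touches the disk at all, which is the whole point of caching recently used data in RAM.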
This is because RAM is going to be able to seek out data really, really fast, whereas a traditional hard disk has to actually spin a disk until it finds the right part and then pull the data off of it. And this is much slower; we call this a mechanical system. When we use RAM, we're using an electronic system that can access any piece of that RAM with near-instantaneous speed.
When the processor needs to reach into RAM and get something this is known as addressing the memory.
Now between the CPU and the memory, we have something known as a memory controller. And in between the memory controller and the processor is a bus. Now a bus is simply a pathway for us to transfer data, and that memory controller is almost like a traffic cop, directing traffic and telling the CPU how to access different parts of the memory.
Now when you look at the bus, there are actually two parts to it. There's the data pathway, which is used to send and receive information, and then there's an address pathway as well that helps us determine where in the memory that data is located.
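The split between the address pathway and the data pathway can be sketched with a toy model. This is a hypothetical illustration, not real bus hardware: the address picks which memory location is selected, the value is what travels on the data pathway, and the number of address lines determines how many locations can be selected at all:

```python
# Hypothetical model of addressing memory: the address bus selects a
# location, the data bus carries the value stored there. With 16
# address lines, 2**16 = 65,536 distinct locations can be selected.
ADDRESS_LINES = 16
memory = bytearray(2 ** ADDRESS_LINES)  # one byte per addressable location

def write(address, value):
    memory[address] = value   # address chooses WHERE; value is the data

def read(address):
    return memory[address]    # the value travels back on the data pathway

write(0x1A2B, 0x7F)           # put the byte 0x7F at address 0x1A2B
print(read(0x1A2B))           # prints 127 (0x7F)
```

This is also why the width of the address pathway limits how much memory a system can use: more address lines means more distinct locations the CPU can point at.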