
Cache Memory Organization in Operating System

    🔹 Introduction

    Cache Memory is a high-speed memory that bridges the gap between the CPU and main memory (RAM). Its main purpose is to reduce data access time and improve overall system performance.

    ✅ Why Cache Memory?

    • The CPU processes data very fast, but RAM cannot supply data at that speed.

    • If the CPU fetched every piece of data directly from RAM, the system would slow down.

    • Cache memory provides data access at a speed close to that of the CPU.

    👉 Memory Hierarchy in a System

    CPU Registers → Cache Memory → Main Memory (RAM) → Secondary Storage (HDD/SSD)

    ✔ Cache Memory sits close to the CPU and is the fastest memory after the CPU registers.
    ✔ Its size is limited, so it stores only frequently used data.


    🔹 Cache Memory Working

    Cache Memory handles CPU requests using the principle of locality:

    📌 Principle of Locality

    1. Temporal Locality: If the CPU accesses a piece of data, there is a good chance it will access the same data again soon.

    2. Spatial Locality: If the CPU accesses a memory location, its nearby locations are also likely to be accessed, as the sketch below illustrates.
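
    The loop below is a minimal, hypothetical Python sketch of how an ordinary program exhibits both kinds of locality; the array and variable names are purely illustrative.

    # Illustrative only: one simple loop shows both kinds of locality.
    data = list(range(1024))   # stands in for a contiguous array in main memory

    total = 0
    for i in range(len(data)):
        total += data[i]       # spatial locality: data[i] and data[i + 1] lie in the
                               # same (or the next) cache block, so after the first
                               # access the neighbours are usually already cached
        # 'total' is read and written on every iteration: temporal locality,
        # so it stays cached (or in a register) for the whole loop
    print(total)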

    📌 Cache Hit & Cache Miss

    Condition   | Explanation
    Cache Hit   | The required data is present in cache memory
    Cache Miss  | The required data is not in cache memory, so it is fetched from RAM

    👉 Diagram: Cache Working Process

    CPU Request → Check in Cache
        → If Found (Cache Hit) → Use Data
        → If Not Found (Cache Miss) → Fetch from RAM → Store in Cache → Use Data
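
    A minimal Python sketch of this flow is given below; RAM, cache and read() are illustrative names, not part of any real OS or hardware interface.

    # Illustrative model of the hit/miss flow above (not a real hardware model).
    RAM = {addr: f"data@{addr}" for addr in range(100)}  # pretend main memory
    cache = {}                                           # pretend cache memory

    def read(addr):
        if addr in cache:                # Cache Hit: data already in the cache
            return cache[addr], "hit"
        value = RAM[addr]                # Cache Miss: fetch from RAM...
        cache[addr] = value              # ...store it in the cache...
        return value, "miss"             # ...then use it

    print(read(42))   # ('data@42', 'miss')  first access misses
    print(read(42))   # ('data@42', 'hit')   repeat access hits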
     

    🔹 Cache Mapping Techniques

    Mapping techniques are used to store data in cache memory efficiently:

    Mapping Technique       | Working                                                              | Pros & Cons
    Direct Mapping          | Each block of main memory maps to one fixed cache block              | Fast but less flexible
    Associative Mapping     | Any memory block can be stored in any cache block                    | Flexible but complex
    Set-Associative Mapping | Blocks are divided into sets; a block can go into any slot of its set | Balance between speed & flexibility

    1️⃣ Direct Mapping

    ✔ Every memory block maps to one fixed cache block.
    ✔ Simple and fast, but collisions are frequent.

    👉 Example:

    Main Memory Block 5 → Cache Block 1
    Main Memory Block 10 → Cache Block 2
    Main Memory Block 15 → Cache Block 3
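
    A minimal sketch of how that placement is computed, assuming a hypothetical cache of 4 blocks (an assumption chosen so that it reproduces the example above):

    # Direct mapping: cache block = memory block number mod number of cache blocks.
    NUM_CACHE_BLOCKS = 4   # assumed cache size, chosen to match the example

    def direct_mapped_block(memory_block):
        return memory_block % NUM_CACHE_BLOCKS

    for block in (5, 10, 15):
        print(f"Main Memory Block {block} → Cache Block {direct_mapped_block(block)}")
    # Main Memory Block 5 → Cache Block 1
    # Main Memory Block 10 → Cache Block 2
    # Main Memory Block 15 → Cache Block 3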
     

    2️⃣ Fully Associative Mapping

    ✔ Any memory block can be stored in any cache block.
    ✔ Collisions are avoided, but searching the whole cache is slower.

    3️⃣ Set-Associative Mapping

    ✔ Memory blocks are divided into groups called sets.
    ✔ Each set can hold multiple blocks, which reduces collisions.
    ✔ It is a fast and flexible combination of Direct and Associative Mapping.
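
    A minimal sketch of set-associative placement, assuming a hypothetical 2-way cache with 4 sets; fully associative mapping is simply the special case of a single set that holds every block.

    # Set-associative placement: the set is fixed, the slot within the set is free.
    NUM_SETS = 4
    WAYS = 2                                  # blocks per set (2-way)
    cache_sets = [[] for _ in range(NUM_SETS)]

    def place(memory_block):
        s = memory_block % NUM_SETS           # the set is chosen by the block number
        if len(cache_sets[s]) >= WAYS:        # set full: evict a block
            cache_sets[s].pop(0)              # (here: oldest-first within the set)
        cache_sets[s].append(memory_block)    # the block may occupy any way of the set
        return s

    for block in (5, 9, 13):
        print(f"Main Memory Block {block} → Set {place(block)}")
    # Blocks 5, 9 and 13 all map to Set 1: the first two share it,
    # and the third evicts the oldest of them.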


    🔹 Cache Replacement Policies

    When the cache is full and new data has to be stored, a replacement policy decides which block to evict:

    Policy                      | Working
    FIFO (First In First Out)   | The block that entered the cache first is replaced
    LRU (Least Recently Used)   | The block that has not been used for the longest time is replaced
    LFU (Least Frequently Used) | The block that has been accessed the fewest times is replaced
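
    A minimal sketch of the LRU policy, assuming a hypothetical cache capacity of 3 blocks; collections.OrderedDict is used only to keep track of recency.

    # LRU replacement: on a miss with a full cache, evict the least recently used block.
    from collections import OrderedDict

    CAPACITY = 3
    cache = OrderedDict()

    def access(block):
        if block in cache:
            cache.move_to_end(block)          # hit: mark as most recently used
            return "hit"
        if len(cache) >= CAPACITY:
            cache.popitem(last=False)         # miss + full cache: evict the LRU block
        cache[block] = True
        return "miss"

    for b in (1, 2, 3, 1, 4):                 # accessing 4 evicts block 2 (the LRU)
        print(b, access(b), list(cache))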

    🔹 Multi-Level Cache

    Modern processors use multiple levels of cache:

    Cache Level        | Characteristics
    L1 Cache (Level 1) | Fastest but smallest (a few KB)
    L2 Cache (Level 2) | Slower than L1 but larger (a few MB)
    L3 Cache (Level 3) | Slowest but largest among the caches (several MB)

    👉 Diagram: Multi-Level Cache

    CPU → L1 Cache → L2 Cache → L3 Cache → RAM → HDD/SSD

    ✔ L1 is the fastest, then L2, then L3.
    ✔ This hierarchy optimizes CPU performance.
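
    The sketch below walks an address lookup through this hierarchy; the contents and the per-level latencies (in CPU cycles) are made-up illustrative values, not figures for any real processor.

    # Multi-level lookup: try each level in order, paying its latency, until the data is found.
    levels = [
        ("L1 Cache", {10: "a"},                           4),
        ("L2 Cache", {10: "a", 20: "b"},                 12),
        ("L3 Cache", {10: "a", 20: "b", 30: "c"},        40),
        ("RAM",      {addr: "?" for addr in range(100)}, 200),
    ]

    def read(addr):
        cost = 0
        for name, store, latency in levels:
            cost += latency                # every level checked adds its latency
            if addr in store:
                return name, cost          # stop at the first level that holds the data
        raise KeyError(addr)

    print(read(10))   # ('L1 Cache', 4)    best case: L1 hit
    print(read(30))   # ('L3 Cache', 56)   misses L1 and L2 first
    print(read(99))   # ('RAM', 256)       misses every cache level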


    🔹 Advantages of Cache Memory

    ✅ Fast data access improves CPU performance
    ✅ Less dependency on RAM, so execution is faster
    ✅ Reduces system latency
    ✅ Efficient power consumption


    🔹 Disadvantages of Cache Memory

    ❌ Limited size, so it cannot hold much data
    ❌ Expensive compared to RAM
    ❌ Frequent cache misses can degrade performance


    🔹 Conclusion

    ✔ Cache Memory is high-speed storage that reduces the gap between the CPU and RAM.
    ✔ Direct, Associative, and Set-Associative mapping techniques are used to place data in it.
    ✔ Multiple levels of cache (L1, L2, L3) optimize performance.
    ✔ Cache replacement policies improve its efficiency.
