How do devices like the Game Boy Advance achieve their frame rate?











I've been designing my own handheld gaming device based around an AVR microcontroller and a small OLED display.



I started off with a 128x64 pixel monochrome display and can comfortably draw to it at over 60 frames per second.



I recently reworked it to use a 128x128 pixel RGB OLED without thinking it through too much, only to find I could achieve just 4 FPS. After some thought and careful refactoring I can get that up to ~12 FPS, if I don't care too much about doing anything else!
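A rough transfer-budget check makes the gap concrete (this assumes a 16 MHz AVR with hardware SPI at F_CPU/2 and 16-bit colour; the exact figures depend on the actual clock and panel):

    /* Why ~12 fps is plausible over SPI: the raw link is the ceiling.   */
    /* Assumes a 16 MHz AVR, hardware SPI at F_CPU/2, 16 bits per pixel. */
    #include <stdio.h>

    int main(void)
    {
        const double frame_bits = 128.0 * 128.0 * 16.0; /* 262,144 bits per frame */
        const double spi_bps    = 16e6 / 2.0;           /* 8 Mbit/s raw SPI rate  */

        /* ~30 fps before spending a single cycle actually drawing anything */
        printf("transfer-only ceiling: %.1f fps\n", spi_bps / frame_bits);
        return 0;
    }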



My question is: how did a device like the GBA (Game Boy Advance) achieve a frame rate of nearly 60 fps? I thought about having a separate 'graphics processor' but realised I would still be bottlenecked transferring the display data to that.



I also wondered about using the vestigial 8-bit parallel interface most of these screens tend to have, which might net me an 8x speed-up, except that modern MCUs don't tend to have hardware parallel interfaces the way they do for serial, and bit-banging will likely eat up much of the speed gain.
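For reference, a bit-banged parallel write might look like the sketch below (the port and strobe assignments are placeholders, not a specific board); the strobe toggling alone costs several cycles per byte, which is why the gain over hardware SPI is smaller than the 8x bus width suggests:

    /* Minimal bit-banged 8-bit parallel write on a classic AVR.             */
    /* Illustrative only: DATA_PORT, CTRL_PORT and WR_PIN are placeholders.  */
    #include <avr/io.h>
    #include <stdint.h>

    #define DATA_PORT PORTB          /* 8 data lines to the panel            */
    #define CTRL_PORT PORTD
    #define WR_PIN    PD2            /* write strobe, assumed active-low     */

    static inline void lcd_write8(uint8_t b)
    {
        DATA_PORT = b;                   /* out: 1 cycle                     */
        CTRL_PORT &= ~(1 << WR_PIN);     /* cbi: 2 cycles, strobe low        */
        CTRL_PORT |=  (1 << WR_PIN);     /* sbi: 2 cycles, data latched      */
    }

    void push_frame(const uint8_t *buf, uint16_t len)
    {
        while (len--)
            lcd_write8(*buf++);          /* plus load/loop overhead per byte */
    }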



What other options exist?



Thanks.










  • "I would still be bottlenecked transferring the display data to that." DSI has four lanes each up to 1.2Gbits/sec. I leave the rest of the calculations to you.
    – Oldfart
    6 hours ago










  • "I would still be bottlenecked transferring the display data to that." Do you mean you believe you'd be bottlenecked transferring data to the graphics processor, or from the graphics processor to the display?
    – TimWescott
    6 hours ago










  • Just like any graphics in any video game device, there's memory that would handle graphics. According to this website, there's an address location for graphics, sound, etc. Instructions would be stored there. Assuming there isn't a lot of data that would create conflicts with performance time, it would run those instructions to load the graphical data with ease.
    – KingDuken
    6 hours ago












  • Most of the responses seem to be missing the key difference between these systems with high bandwidth screen interfaces, and the poster's attempt with a bit serial interfaced display.
    – Chris Stratton
    5 hours ago






  • buy the display without the controller on it and make your own controller
    – old_timer
    4 hours ago

5 Answers

"My question is - how did a device like the GBA achieve a frame rate of nearly 60fps?"



To answer just the question, they did it with a graphics processor. I'm pretty sure the Game Boy used sprite graphics. At a top level, that means that the graphics processor gets loaded with things like an image of a background, an image of Mario, an image of Princess Peach, etc. Then the main processor issues commands like "show the background offset by this much in x and y, overlay Mario image #3 at this x, y position", etc. So the main processor is absolutely, positively not concerned with drawing each pixel, and the graphics processor is absolutely, positively not concerned with computing the state of the game. Each is optimized for what it needs to do, and the result is a pretty good video game without using a lot of computation power.






The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.
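As a concrete illustration of "just update the location": moving a GBA sprite is a couple of halfword writes into OAM rather than any pixel pushing (layout as described in hobbyist documentation such as Tonc; the masking below ignores the other attribute bits for brevity):

    #include <stdint.h>

    typedef struct {
        uint16_t attr0, attr1, attr2, pad;   /* one OAM entry = 8 bytes */
    } obj_attr;

    #define OAM ((volatile obj_attr *)0x07000000)

    /* attr0 bits 0-7 hold Y, attr1 bits 0-8 hold X */
    void move_sprite(int id, int x, int y)
    {
        OAM[id].attr0 = (OAM[id].attr0 & ~0x00FF) | ((uint16_t)y & 0x00FF);
        OAM[id].attr1 = (OAM[id].attr1 & ~0x01FF) | ((uint16_t)x & 0x01FF);
    }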



The video processor can then work pixel by pixel to determine which sprite to draw at that point.



However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



(1) Notable exception: Amiga






    • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.
      – TimWescott
      4 hours ago












    • @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.
      – Mr.Mindor
      1 hour ago










    • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.
      – TimWescott
      1 hour ago


















"how did a device like the GBA achieve a frame rate of nearly 60fps?"




Hardware.



It's got graphics memory, which may or may not share the same bus as program/data memory... but the important bit is that it has a graphics processor which reads the memory 60 times per second and sends the data to the LCD using an optimized interface which is designed to do this efficiently.



You can do the same with any modern microcontroller equipped with an "LCD interface" peripheral, for example the LPC4330, although this might be way overkill. Of course you will need a compatible LCD panel.
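The shape of that approach, independent of the particular part (the register write in the closing comment is indicative only, not copied from a datasheet):

    /* With an LCD-controller peripheral the CPU only writes a framebuffer in
     * RAM; the peripheral streams it to the panel ~60 times a second on its
     * own, typically via DMA. */
    #include <stdint.h>

    #define WIDTH  128
    #define HEIGHT 128

    static uint16_t framebuffer[WIDTH * HEIGHT];          /* RGB565 */

    void put_pixel(int x, int y, uint16_t colour)
    {
        framebuffer[y * WIDTH + x] = colour;   /* no panel traffic from the CPU */
    }

    /* One-time setup points the controller at the buffer, something like
     *     LCD->UPBASE = (uint32_t)framebuffer;
     * after which refresh is pure hardware with no CPU involvement. */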



With modern fast microcontrollers (i.e., ARM, not AVR) and such a tiny screen, you probably won't need sprites or a blitter to accelerate graphics operations. With an 8-bit AVR it might be slow.



But no matter the CPU, bit-banging the interface to the display is going to suck.



I believe the Atari 2600 used CPU bit-banging to send the picture to the TV. That's a little bit obsolete.






    • Even the 2600 had hardware sprites, although a very limited number (two players and two bullets I think)
      – pjc50
      5 hours ago










    • @pjc50, the Atari 2600 sort of had hardware sprites. Like every other part of the graphics subsystem, they were one-dimensional objects. If the programmer wanted something other than a set of vertical lines, the program needed to update the sprites after each row was drawn to the screen.
      – Mark
      1 hour ago


















The GBA had a pretty slow processor; the ARM7 is very nice, but they ran it slow and gave it next to no resources. There is a reason why a lot of Nintendo games at that point and before were side scrollers: HARDWARE. It is all done in hardware. You had multiple layers of tiles plus one or more sprites, and the hardware did all the work to extract pixels from those tables and drive the display. What you did was build the tile set up front and then fill a smallish memory that was a tile map: want the lower-left tile to be tile 7, you put a 7 in that memory location; want the next tile over to be tile 19 in the tile set, you put a 19 there; and so on, for each layer you have enabled. For a sprite you simply set its x/y address. You can also do scaling and rotation by setting some registers, and the hardware takes care of the rest.
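A sketch of what "put a 7 in that memory location" looks like in practice (GBA-style addresses as given in hobbyist docs; treat them as illustrative):

    #include <stdint.h>

    #define MAP_BASE ((volatile uint16_t *)0x06000800)   /* a 32x32-entry tile map in VRAM */
    #define BG0_HOFS (*(volatile uint16_t *)0x04000010)  /* background 0 horizontal scroll */

    void place_tile(int col, int row, uint16_t tile_index)
    {
        MAP_BASE[row * 32 + col] = tile_index;  /* one 16-bit write covers 64 pixels */
    }

    void scroll_background(int x)
    {
        BG0_HOFS = (uint16_t)x;   /* the whole layer moves; the hardware redraws it */
    }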



Mode 7, if I remember right, was a pixel mode, but that was like a traditional video card where you put in bytes that cover the color for a pixel and the hardware takes care of the video refresh. I think you could ping-pong, or at least flip buffers when you had a new frame ready, but I don't remember exactly; again, the processor was fairly underclocked for that day and age and didn't have too many fast resources. So while some games used the pixel mode, a lot were tile-based side scrollers...



If you want a high frame rate you need to design for it; you don't just take any old display you find and talk to it via SPI or I2C or something like that. Put at least one framebuffer in front of it, ideally two, and have row and column control over the display if possible. A number of the displays I suspect you are buying have a controller on them that you are actually talking to; if you want GBA/console-type performance, you create/implement the controller yourself. Or you buy/build something with a GPU/video chip/logic blob and use HDMI or another common interface into a stock monitor.



Just because a bicycle has tires and a chain and gears doesn't mean it can go as fast as a motorcycle. You need to design the system to meet your performance needs, end to end. You can put that bicycle wheel on the motorcycle, but it won't perform as desired; all of the components have to be part of the overall design.



Asteroids worked this way too: it only needed one 6502. The vector graphics were done with separate logic; the 6502 sent a tiny string of data to the vector graphics controller, which used a ROM and that data to do the X/Y plotting of the beam and the Z (on/off)... Some stand-up cabinets had separate processors to handle audio and video, separate from the processor computing the game. And of course today the video is handled by anywhere from a few to hundreds if not thousands of processors that are separate from the main processor...






Other answers cover your question pretty well at an abstract level (hardware), but having actual experience with the GBA in particular, I figured a more detailed explanation might be worthwhile.



The GBA had many drawing modes and settings which could be used to control how the graphics processor interpreted the video RAM, but one thing was inescapable: the frame rate. The graphics processor was drawing to the screen in a nearly (more on this below) constant loop. (This is likely the most relevant bit for your question.)
It would draw one line at a time, taking a very short break between each. After drawing the last line of the frame it would take a break roughly equal to the time it takes to draw 30 lines, then start again. The timing of each line and the timing of each frame were all predetermined and set in stone. In a lot of ways the graphics processor was really the master of the system, and you needed to write your games around its behavior, because it would continue doing what it did whether you were ready or not.



Roughly 75-80% of the time it was actively pushing to the screen. What frame rates could you accomplish if you were doing the same?



That 80% of the time was also all the time the CPU had to process user input, calculate game state, and load sprites/tiles into the areas of VRAM that were currently off screen (or at least not included in the line currently being drawn).



The 20% between frames was all the CPU had to tweak video settings or RAM in ways that would impact the whole next frame.
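The resulting loop structure looks roughly like this (GBA-style VCOUNT register, with the vertical blank starting at line 160; a real game would sleep on the VBlank interrupt instead of spinning):

    #include <stdint.h>

    #define REG_VCOUNT (*(volatile uint16_t *)0x04000006)   /* current scanline */

    static void wait_for_vblank(void)
    {
        while (REG_VCOUNT >= 160) { }   /* if already in a blank, let it finish  */
        while (REG_VCOUNT <  160) { }   /* then wait for the next blank to start */
    }

    int main(void)
    {
        for (;;) {
            /* the ~80%: runs while the hardware is busy drawing the frame       */
            /*   read_input(); update_game_state(); stage_offscreen_tiles();     */
            wait_for_vblank();
            /* the ~20%: safe to touch whole-frame state (scroll, palettes, OAM) */
            /*   commit_frame_settings();                                        */
        }
    }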



At the end of each line, the graphics processor would send a line sync interrupt to the CPU. This interrupt could be used to tweak settings on a few sprites or a few background layers (this is how you can get an effect like a conical spotlight: by changing the size and location of one of the rectangular masks between each line drawn, since as far as the hardware is concerned all those regions are rectangular). You have to be careful to keep these updates small and finish before the graphics processor starts drawing the next line, or you can get ugly results. Any time spent processing these interrupts also cut into that 80% of the CPU's processing time.
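A sketch of that per-line trick (GBA-style window register, packing as documented in hobbyist references; interrupt setup is omitted):

    #include <stdint.h>

    #define REG_VCOUNT (*(volatile uint16_t *)0x04000006)   /* current scanline          */
    #define REG_WIN0H  (*(volatile uint16_t *)0x04000040)   /* window 0: left/right edge */

    static uint8_t spot_left[160], spot_right[160];  /* per-line edges, filled by the game */

    void hblank_handler(void)
    {
        uint16_t next = REG_VCOUNT + 1;     /* the new setting applies to the next line */
        if (next < 160)
            REG_WIN0H = (uint16_t)((spot_left[next] << 8) | spot_right[next]);
    }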



For games that got the most out of this system, neither the CPU nor the graphics processor ever took a real break; each was chasing the other around the loop, updating what the other wasn't currently looking at.






      share|improve this answer








      New contributor




      Mr.Mindor is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.


















        Your Answer





        StackExchange.ifUsing("editor", function () {
        return StackExchange.using("mathjaxEditing", function () {
        StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
        StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["\$", "\$"]]);
        });
        });
        }, "mathjax-editing");

        StackExchange.ifUsing("editor", function () {
        return StackExchange.using("schematics", function () {
        StackExchange.schematics.init();
        });
        }, "cicuitlab");

        StackExchange.ready(function() {
        var channelOptions = {
        tags: "".split(" "),
        id: "135"
        };
        initTagRenderer("".split(" "), "".split(" "), channelOptions);

        StackExchange.using("externalEditor", function() {
        // Have to fire editor after snippets, if snippets enabled
        if (StackExchange.settings.snippets.snippetsEnabled) {
        StackExchange.using("snippets", function() {
        createEditor();
        });
        }
        else {
        createEditor();
        }
        });

        function createEditor() {
        StackExchange.prepareEditor({
        heartbeatType: 'answer',
        convertImagesToLinks: false,
        noModals: true,
        showLowRepImageUploadWarning: true,
        reputationToPostImages: null,
        bindNavPrevention: true,
        postfix: "",
        imageUploader: {
        brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
        contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
        allowUrls: true
        },
        onDemand: true,
        discardSelector: ".discard-answer"
        ,immediatelyShowMarkdownHelp:true
        });


        }
        });














        draft saved

        draft discarded


















        StackExchange.ready(
        function () {
        StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2felectronics.stackexchange.com%2fquestions%2f412696%2fhow-do-devices-like-the-game-boy-advance-achieve-their-frame-rate%23new-answer', 'question_page');
        }
        );

        Post as a guest















        Required, but never shown

























        5 Answers
        5






        active

        oldest

        votes








        5 Answers
        5






        active

        oldest

        votes









        active

        oldest

        votes






        active

        oldest

        votes








        up vote
        3
        down vote













        "My question is - how did a device like the GBA achieve a frame rate of nearly 60fps?"



        To answer just the question, they did it with a graphics processer. I'm pretty sure the Game Boy used sprite graphics. At a top level, that means that the graphics processor gets loaded things like an image of a background, and an image of Mario, and an image of Princess Peach, etc. Then the main processor issues commands like "show the background offset by this much in x and y, overlay Mario image #3 at this x, y position", etc. So the main processor is absolutely positively not concerned with drawing each pixel, and the graphics processor is absolutely positively not concerned with computing the state of the game. Each is optimized for what it needs to do, and the result is a pretty good video game without using a lot of computation power.






        share|improve this answer

























          up vote
          3
          down vote













          "My question is - how did a device like the GBA achieve a frame rate of nearly 60fps?"



          To answer just the question, they did it with a graphics processer. I'm pretty sure the Game Boy used sprite graphics. At a top level, that means that the graphics processor gets loaded things like an image of a background, and an image of Mario, and an image of Princess Peach, etc. Then the main processor issues commands like "show the background offset by this much in x and y, overlay Mario image #3 at this x, y position", etc. So the main processor is absolutely positively not concerned with drawing each pixel, and the graphics processor is absolutely positively not concerned with computing the state of the game. Each is optimized for what it needs to do, and the result is a pretty good video game without using a lot of computation power.






          share|improve this answer























            up vote
            3
            down vote










            up vote
            3
            down vote









            "My question is - how did a device like the GBA achieve a frame rate of nearly 60fps?"



            To answer just the question, they did it with a graphics processer. I'm pretty sure the Game Boy used sprite graphics. At a top level, that means that the graphics processor gets loaded things like an image of a background, and an image of Mario, and an image of Princess Peach, etc. Then the main processor issues commands like "show the background offset by this much in x and y, overlay Mario image #3 at this x, y position", etc. So the main processor is absolutely positively not concerned with drawing each pixel, and the graphics processor is absolutely positively not concerned with computing the state of the game. Each is optimized for what it needs to do, and the result is a pretty good video game without using a lot of computation power.






            share|improve this answer












            "My question is - how did a device like the GBA achieve a frame rate of nearly 60fps?"



            To answer just the question, they did it with a graphics processer. I'm pretty sure the Game Boy used sprite graphics. At a top level, that means that the graphics processor gets loaded things like an image of a background, and an image of Mario, and an image of Princess Peach, etc. Then the main processor issues commands like "show the background offset by this much in x and y, overlay Mario image #3 at this x, y position", etc. So the main processor is absolutely positively not concerned with drawing each pixel, and the graphics processor is absolutely positively not concerned with computing the state of the game. Each is optimized for what it needs to do, and the result is a pretty good video game without using a lot of computation power.







            share|improve this answer












            share|improve this answer



            share|improve this answer










            answered 6 hours ago









            TimWescott

            2,43429




            2,43429
























                up vote
                3
                down vote













                The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



                The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.



                The video processor can then work pixel by pixel to determine which sprite to draw at that point.



                However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



                (1) Notable exception: Amiga






                share|improve this answer





















                • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.
                  – TimWescott
                  4 hours ago












                • @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.
                  – Mr.Mindor
                  1 hour ago










                • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.
                  – TimWescott
                  1 hour ago















                up vote
                3
                down vote













                The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



                The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.



                The video processor can then work pixel by pixel to determine which sprite to draw at that point.



                However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



                (1) Notable exception: Amiga






                share|improve this answer





















                • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.
                  – TimWescott
                  4 hours ago












                • @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.
                  – Mr.Mindor
                  1 hour ago










                • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.
                  – TimWescott
                  1 hour ago













                up vote
                3
                down vote










                up vote
                3
                down vote









                The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



                The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.



                The video processor can then work pixel by pixel to determine which sprite to draw at that point.



                However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



                (1) Notable exception: Amiga






                share|improve this answer












                The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



                The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.



                The video processor can then work pixel by pixel to determine which sprite to draw at that point.



                However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



                (1) Notable exception: Amiga







                share|improve this answer












                share|improve this answer



                share|improve this answer










                answered 6 hours ago









                pjc50

                33.2k33982




                33.2k33982












                • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.
                  – TimWescott
                  4 hours ago












                • @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.
                  – Mr.Mindor
                  1 hour ago










                • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.
                  – TimWescott
                  1 hour ago


















                • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.
                  – TimWescott
                  4 hours ago












                • @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.
                  – Mr.Mindor
                  1 hour ago










                • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.
                  – TimWescott
                  1 hour ago
















                Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.
                – TimWescott
                4 hours ago






                Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.
                – TimWescott
                4 hours ago














                @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.
                – Mr.Mindor
                1 hour ago




                @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.
                – Mr.Mindor
                1 hour ago












                @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.
                – TimWescott
                1 hour ago




                @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.
                – TimWescott
                1 hour ago










                up vote
                2
                down vote














                how did a device like the GBA achieve a frame rate of nearly 60fps?




                Hardware.



                It's got graphics memory, which may or may not share the same bus as program/data memory... but the important bit is that it has a graphics processor which reads the memory 60 times per second and sends the data to the LCD using an optimized interface which is designed to do this efficiently.



                You can do the same with any modern microcontroller equipped with a "LCD interface" peripheral, for example the LPC4330 although this might be way overkill. Of course you will need a compatible LCD panel.



                With modern fast microcontrollers (ie, ARM not AVR) and such a tiny screen, you probably won't need sprites or a blitter to accelerate graphics operations. With a 8-bit AVR it might be slow.



                But no matter the cpu, bit banging the interface to the display is going to suck.



                I believe the Atari 2600 used CPU bit-banging to send the picture to the TV. That's a little bit obsolete.






                share|improve this answer





















                • Even the 2600 had hardware sprites, although a very limited number (two players and two bullets I think)
                  – pjc50
                  5 hours ago










                • @pjc50, the Atari 2600 sort of had hardware sprites. Like every other part of the graphics subsystem, they were one-dimensional objects. If the programmer wanted something other than a set of vertical lines, the program needed to update the sprites after each row was drawn to the screen.
                  – Mark
                  1 hour ago















                up vote
                2
                down vote














                how did a device like the GBA achieve a frame rate of nearly 60fps?




                Hardware.



                It's got graphics memory, which may or may not share the same bus as program/data memory... but the important bit is that it has a graphics processor which reads the memory 60 times per second and sends the data to the LCD using an optimized interface which is designed to do this efficiently.



                You can do the same with any modern microcontroller equipped with a "LCD interface" peripheral, for example the LPC4330 although this might be way overkill. Of course you will need a compatible LCD panel.



                With modern fast microcontrollers (ie, ARM not AVR) and such a tiny screen, you probably won't need sprites or a blitter to accelerate graphics operations. With a 8-bit AVR it might be slow.



                But no matter the cpu, bit banging the interface to the display is going to suck.



                I believe the Atari 2600 used CPU bit-banging to send the picture to the TV. That's a little bit obsolete.






                share|improve this answer





















                • Even the 2600 had hardware sprites, although a very limited number (two players and two bullets I think)
                  – pjc50
                  5 hours ago










                • @pjc50, the Atari 2600 sort of had hardware sprites. Like every other part of the graphics subsystem, they were one-dimensional objects. If the programmer wanted something other than a set of vertical lines, the program needed to update the sprites after each row was drawn to the screen.
                  – Mark
                  1 hour ago













                up vote
                2
                down vote










                up vote
                2
                down vote










                how did a device like the GBA achieve a frame rate of nearly 60fps?




                Hardware.



                It's got graphics memory, which may or may not share the same bus as program/data memory... but the important bit is that it has a graphics processor which reads the memory 60 times per second and sends the data to the LCD using an optimized interface which is designed to do this efficiently.



                You can do the same with any modern microcontroller equipped with a "LCD interface" peripheral, for example the LPC4330 although this might be way overkill. Of course you will need a compatible LCD panel.



                With modern fast microcontrollers (ie, ARM not AVR) and such a tiny screen, you probably won't need sprites or a blitter to accelerate graphics operations. With a 8-bit AVR it might be slow.



                But no matter the cpu, bit banging the interface to the display is going to suck.



                I believe the Atari 2600 used CPU bit-banging to send the picture to the TV. That's a little bit obsolete.






                share|improve this answer













                how did a device like the GBA achieve a frame rate of nearly 60fps?




                Hardware.



                It's got graphics memory, which may or may not share the same bus as program/data memory... but the important bit is that it has a graphics processor which reads the memory 60 times per second and sends the data to the LCD using an optimized interface which is designed to do this efficiently.



                You can do the same with any modern microcontroller equipped with a "LCD interface" peripheral, for example the LPC4330 although this might be way overkill. Of course you will need a compatible LCD panel.



                With modern fast microcontrollers (ie, ARM not AVR) and such a tiny screen, you probably won't need sprites or a blitter to accelerate graphics operations. With a 8-bit AVR it might be slow.



                But no matter the cpu, bit banging the interface to the display is going to suck.



                I believe the Atari 2600 used CPU bit-banging to send the picture to the TV. That's a little bit obsolete.







                share|improve this answer












                share|improve this answer



                share|improve this answer










                answered 6 hours ago









                peufeu

                24.3k23972




                24.3k23972












                • Even the 2600 had hardware sprites, although a very limited number (two players and two bullets I think)
                  – pjc50
                  5 hours ago










                • @pjc50, the Atari 2600 sort of had hardware sprites. Like every other part of the graphics subsystem, they were one-dimensional objects. If the programmer wanted something other than a set of vertical lines, the program needed to update the sprites after each row was drawn to the screen.
                  – Mark
                  1 hour ago


















                • Even the 2600 had hardware sprites, although a very limited number (two players and two bullets I think)
                  – pjc50
                  5 hours ago










                • @pjc50, the Atari 2600 sort of had hardware sprites. Like every other part of the graphics subsystem, they were one-dimensional objects. If the programmer wanted something other than a set of vertical lines, the program needed to update the sprites after each row was drawn to the screen.
                  – Mark
                  1 hour ago
















                Even the 2600 had hardware sprites, although a very limited number (two players and two bullets I think)
                – pjc50
                5 hours ago




                Even the 2600 had hardware sprites, although a very limited number (two players and two bullets I think)
                – pjc50
                5 hours ago












                @pjc50, the Atari 2600 sort of had hardware sprites. Like every other part of the graphics subsystem, they were one-dimensional objects. If the programmer wanted something other than a set of vertical lines, the program needed to update the sprites after each row was drawn to the screen.
                – Mark
                1 hour ago




                @pjc50, the Atari 2600 sort of had hardware sprites. Like every other part of the graphics subsystem, they were one-dimensional objects. If the programmer wanted something other than a set of vertical lines, the program needed to update the sprites after each row was drawn to the screen.
                – Mark
                1 hour ago










                up vote
                1
                down vote













                The gba had a pretty slow processor, the arm7 is very nice they just ran it slow and gave it next to no resources. There is a reason why a lot of nintendo games at that point and before were side scrollers. HARDWARE. It is all done in hardware, you had multiple layers of tiles plus one or more sprites the hardware did all the work to extract pixels from those tables and drive the display. what you did was build the tile set up front and then had a smallish memory that was a tile map want the lower left tile to be tile 7 you put a 7 in that memory location want the next tile over to be tile 19 in the tile set you put a 19 there and so on, for each layer that you have enabled. for the sprite you simply set the x/y address. you can also do scaling and rotation by setting some registers and the hardware takes care of the rest.



                mode 7 if I remember right was a pixel mode but that was like a traditional video card where you put bytes in that cover the color for a pixel the hardware takes care of the video refresh. I think you could ping pong or at least when you had a new frame you could flip them but i dont remember right, again the processor was fairly underclocked for that day and age and didnt have too many fast resources. so while some games were mode 7 a lot were tile based side scrollers...



                If you want a solution that is a high frame rate you need to design that solution you dont just take any old display you find and talk to via spi or i2c or something like that put put at least one framebuffer in front of it ideally two and have row and column control if possible over that display. a number of the displays I suspect you are buying have a controller on them that you are actually talking to, you want gba/console type performance you create/implement the controller. Or you buy/build with a gpu/video chip/logic blob, and use hdmi or other common interface into a stock monitor.



                just because a bicycle has tires and a chain and gears doesnt mean it can go as fast as a motorcycle, you need to design the system to meet your performance needs, end to end. can put that bicycle wheel on that motorcycle but it wont perform as desired, all of the components have to be part of the overall design.



                asteroids worked this way too, only needed one 6502, the vector graphics was done with separate logic, the 6502 sent a tiny string of data to the vector graphics controller, which used a rom and that data to do the xy plotting of the beam and z, on/off...some standups had separate processors to handle audio and video separate from the procesor computing the game. and of course today the video is handled by some to hundreds if not thousands of processors that are separate from the main processor...






                share|improve this answer

























                  up vote
                  1
                  down vote













                  The gba had a pretty slow processor, the arm7 is very nice they just ran it slow and gave it next to no resources. There is a reason why a lot of nintendo games at that point and before were side scrollers. HARDWARE. It is all done in hardware, you had multiple layers of tiles plus one or more sprites the hardware did all the work to extract pixels from those tables and drive the display. what you did was build the tile set up front and then had a smallish memory that was a tile map want the lower left tile to be tile 7 you put a 7 in that memory location want the next tile over to be tile 19 in the tile set you put a 19 there and so on, for each layer that you have enabled. for the sprite you simply set the x/y address. you can also do scaling and rotation by setting some registers and the hardware takes care of the rest.



                  mode 7 if I remember right was a pixel mode but that was like a traditional video card where you put bytes in that cover the color for a pixel the hardware takes care of the video refresh. I think you could ping pong or at least when you had a new frame you could flip them but i dont remember right, again the processor was fairly underclocked for that day and age and didnt have too many fast resources. so while some games were mode 7 a lot were tile based side scrollers...



                  If you want a solution that is a high frame rate you need to design that solution you dont just take any old display you find and talk to via spi or i2c or something like that put put at least one framebuffer in front of it ideally two and have row and column control if possible over that display. a number of the displays I suspect you are buying have a controller on them that you are actually talking to, you want gba/console type performance you create/implement the controller. Or you buy/build with a gpu/video chip/logic blob, and use hdmi or other common interface into a stock monitor.



                  just because a bicycle has tires and a chain and gears doesnt mean it can go as fast as a motorcycle, you need to design the system to meet your performance needs, end to end. can put that bicycle wheel on that motorcycle but it wont perform as desired, all of the components have to be part of the overall design.



                  asteroids worked this way too, only needed one 6502, the vector graphics was done with separate logic, the 6502 sent a tiny string of data to the vector graphics controller, which used a rom and that data to do the xy plotting of the beam and z, on/off...some standups had separate processors to handle audio and video separate from the procesor computing the game. and of course today the video is handled by some to hundreds if not thousands of processors that are separate from the main processor...






                  share|improve this answer























                    up vote
                    1
                    down vote










                    up vote
                    1
                    down vote









                    The gba had a pretty slow processor, the arm7 is very nice they just ran it slow and gave it next to no resources. There is a reason why a lot of nintendo games at that point and before were side scrollers. HARDWARE. It is all done in hardware, you had multiple layers of tiles plus one or more sprites the hardware did all the work to extract pixels from those tables and drive the display. what you did was build the tile set up front and then had a smallish memory that was a tile map want the lower left tile to be tile 7 you put a 7 in that memory location want the next tile over to be tile 19 in the tile set you put a 19 there and so on, for each layer that you have enabled. for the sprite you simply set the x/y address. you can also do scaling and rotation by setting some registers and the hardware takes care of the rest.



                    mode 7 if I remember right was a pixel mode but that was like a traditional video card where you put bytes in that cover the color for a pixel the hardware takes care of the video refresh. I think you could ping pong or at least when you had a new frame you could flip them but i dont remember right, again the processor was fairly underclocked for that day and age and didnt have too many fast resources. so while some games were mode 7 a lot were tile based side scrollers...



                    If you want a solution that is a high frame rate you need to design that solution you dont just take any old display you find and talk to via spi or i2c or something like that put put at least one framebuffer in front of it ideally two and have row and column control if possible over that display. a number of the displays I suspect you are buying have a controller on them that you are actually talking to, you want gba/console type performance you create/implement the controller. Or you buy/build with a gpu/video chip/logic blob, and use hdmi or other common interface into a stock monitor.



Just because a bicycle has tires and a chain and gears doesn't mean it can go as fast as a motorcycle. You need to design the system to meet your performance needs, end to end. You can put a bicycle wheel on a motorcycle, but it won't perform as desired; all of the components have to be part of the overall design.



Asteroids worked this way too. It only needed one 6502 because the vector graphics were done with separate logic: the 6502 sent a small string of data to the vector graphics controller, which used a ROM and that data to do the X/Y plotting of the beam and the Z (beam on/off). Some stand-up arcade machines had separate processors to handle audio and video apart from the processor computing the game. And of course today the video is handled by anywhere from a few to hundreds, if not thousands, of processors that are separate from the main processor.




























                    answered 4 hours ago









                    old_timer























                        up vote
                        0
                        down vote













Other answers cover your question pretty well at an abstract level (hardware), but having actual experience with the GBA in particular, I figured a more detailed explanation may be worthwhile.



The GBA had many drawing modes and settings which could be used to control how the graphics processor interpreted the video RAM, but one thing was inescapable: the frame rate. The graphics processor was drawing to the screen in a nearly (more on this below) constant loop. (This is likely the most relevant bit for your question.)
It would draw one line at a time, taking a very short break between each. After drawing the last line of the frame it would take a break roughly equal to the time it takes to draw 68 lines, then start again. The timing of each line and the timing of each frame were predetermined and set in stone. In a lot of ways the graphics processor was really the master of the system, and you needed to write your games around its behavior, because it would continue doing what it does whether you were ready or not.
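Those fixed timings are well documented; plugging the commonly cited numbers (treated here as assumptions from the GBATEK documentation rather than measurements) into a quick calculation reproduces the GBA's familiar ~59.7 Hz refresh:

```c
/* Back-of-envelope frame-rate check from the GBA's published timing. */
#include <stdio.h>

int main(void)
{
    const double cpu_hz          = 16777216.0; /* 2^24 Hz system clock          */
    const double cycles_per_line = 1232.0;     /* 960 visible + 272 HBlank      */
    const double lines_per_frame = 228.0;      /* 160 visible + 68 VBlank lines */

    double fps = cpu_hz / (cycles_per_line * lines_per_frame);
    printf("GBA frame rate: %.2f Hz\n", fps);  /* ~59.73 Hz */
    return 0;
}
```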



Roughly 70% of each frame (160 of the 228 line-times) it was actively pushing lines to the screen. What frame rates could you accomplish if you were doing the same?



That ~70% of the time was also all the CPU had to process user input, calculate game state, and load sprites/tiles into areas of VRAM that were currently off-screen (or at least not included in the line currently being drawn).



The remaining ~30% between frames (the vertical blank) was all the CPU had to tweak video settings or RAM in ways that would affect the whole next frame.



At the end of each line, the graphics processor would raise a line-sync (HBlank) interrupt to the CPU. This interrupt could be used to tweak settings on a few sprites or a few background layers; this is how you get an effect like a conical spotlight, by changing the size and location of one of the rectangular window masks between each line drawn (as far as the hardware is concerned, all those regions are rectangular). You have to be careful to keep these updates small and to finish before the graphics processor starts drawing the next line, or you get ugly results. Any time spent processing these interrupts also cut into that ~70% of the CPU's processing time.
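As an illustration of that per-line trick, here is a rough sketch of an HBlank handler that reshapes window 0 each scanline to carve a circular spotlight out of an otherwise rectangular mask. The register addresses and the WIN0H byte layout are assumptions taken from the usual documentation, interrupt setup and acknowledgement are omitted, and a real game would precompute the half-widths into a table rather than calling sqrt() in an interrupt.

```c
#include <stdint.h>
#include <math.h>

#define REG_VCOUNT (*(volatile uint16_t *)0x04000006)  /* scanline just finished       */
#define REG_WIN0H  (*(volatile uint16_t *)0x04000040)  /* window 0: left byte | right  */

void hblank_handler(void)
{
    int y  = (REG_VCOUNT + 1) % 228;   /* line the hardware draws next            */
    int cx = 120, cy = 80, r = 40;     /* spotlight centre and radius (screen px) */
    int dy = y - cy;

    if (dy > -r && dy < r) {
        int half  = (int)sqrt((double)(r * r - dy * dy));  /* lookup table in practice */
        int left  = cx - half;
        int right = cx + half;
        REG_WIN0H = (uint16_t)((left << 8) | right);       /* open only the lit span   */
    } else {
        REG_WIN0H = 0;                                     /* zero-width: line is dark */
    }
}
```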



For games that got the most out of this system, neither the CPU nor the graphics processor ever took a real break; each was chasing the other around the loop, updating whatever the other wasn't currently looking at.























                            answered 10 mins ago









                            Mr.Mindor


































