Violet (my birth flower) + 08 (my birth year). Pretty basic.

  • over_clox@lemmy.world · 3 days ago

    Probably so, I got zapped quite a few times myself.

    That’s probably what’s wrong with me ain’t it? 🤔

      • over_clox@lemmy.world · 1 day ago

        It was officially rated for a max of 1280x1024, 50Hz progressive. The more common resolution of choice back then was 1024x768, 60Hz.

        Well, I wanted a much higher resolution than that, so by absolutely maxing out the horizontal output transistor frequency at 64 kHz and doing some quick number crunching, I was able to make a custom display mode of 2048x1536, 25Hz interlaced.

        Although the vertical refresh rate was both cut in half and switched to interlaced, it still absolutely qualifies as overclocking: the horizontal output transistor (HOT) is basically the most stressed non-high-voltage component in any CRT monitor.

        Running the HOT at the bleeding edge of the manufacturer's 64 kHz frequency rating could and would indeed burn it out prematurely if run like that too long, especially without additional cooling. So I didn't run it that way for long; I just wanted to see if it was even possible.
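        For a rough sense of the line-rate arithmetic involved (illustrative figures only; my actual modeline is long gone, and real timings add blanking intervals on top of these active-line minimums):

```python
# Back-of-the-envelope CRT line-rate arithmetic (active lines only;
# a real modeline adds horizontal/vertical blanking on top of this).

def min_line_rate_khz(active_lines, frame_rate_hz):
    """Lower bound on horizontal scan rate in kHz: every line of the
    frame must be drawn once per frame, interlaced or not."""
    return active_lines * frame_rate_hz / 1000.0

# 1536 active lines at a typical 60 Hz progressive would demand a
# ~92 kHz scan rate -- far beyond a horizontal stage rated for 64 kHz:
print(min_line_rate_khz(1536, 60))   # 92.16

# Dropping to 25 Hz frames (50 interlaced fields per second) brings
# the active-line minimum down to ~38 kHz, leaving room for the
# generous retrace blanking an all-analog 1994 monitor needs:
print(min_line_rate_khz(1536, 25))   # 38.4
```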

        The monitor itself was from 1994, so it was effectively all analog: no digital on-screen menus, no signal checking, and no error message on the screen to say boohoo, video mode not supported. The monitor just tuned itself to the signal and frequencies I calculated and arranged for it.

        It did have quite a few analog image transform buttons on its front panel, though, for things like trapezoid and shear distortion, raster rotation, and corner bowing. Most monitors from 1994 didn't have those, which meant the CRT yoke probably had twice as many deflection coils as a regular consumer CRT.

        Not bad for a monitor from 1994 that probably never saw anything over 1024x768 in practical use before I ever acquired it. I got literally 4 times the pixels of the typical desktop resolution of the time.

        Was it worth it? For daily use, no. For learning experience, absolutely!

              • over_clox@lemmy.world · 1 day ago

                I long ago lost that monitor and the related hardware to the sands of time, moving from one place to another multiple times.

                The benefit though was that I effectively quadrupled the number of pixels on screen from the common 1024x768 resolution of the time.

                1024x2=2048, 768x2=1536
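                That doubling works out to exactly four times the pixel count:

```python
# Doubling each axis of 1024x768 yields exactly 4x the pixels.
base = 1024 * 768      # 786,432 pixels
custom = 2048 * 1536   # 3,145,728 pixels
print(custom // base)  # 4
```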

                I was basically pioneering early extra high definition video output before it was even a thing.

                The images themselves wouldn’t look any different, except smaller: each pixel was only 1/4 of its original size, giving me a much larger visible pixel area for image editing.

                It wouldn’t have helped gameplay much though, as I had to sacrifice framerate to accomplish that.

                Edit: You definitely can’t do shit like that on modern LCDs; that category of overclocking is exclusive to old CRTs.

                  • over_clox@lemmy.world · 1 day ago

                    You’ve actually read this far?

                    Thank you for appreciating my words and the time it took to type them.

                    Sam’s Sci.Electronics.Repair FAQ: https://repairfaq.org/

                    Edit: I am not Sam; this is just a treasure trove of information on how to troubleshoot and repair discrete electronics.

                    Sam Goldwasser is to me almost as important as Linus Torvalds and Richard Stallman…