• Treczoks@kbin.social · 1 year ago

    Decades ago, the TV took five minutes to warm the tubes up before one could watch the news.

    Today, the TV takes five minutes to boot, install updates, and mangle some configuration before one (eventually) can watch the news - if the TV hasn't lost its list of stations again.

    • Link.wav [he/him]@beehaw.org (OP) · 1 year ago

      By the mid 80s and into the 90s, CRT TVs took just seconds to show a picture. Even the really old tube TV my grandma had would warm up within seconds.

      • Treczoks@kbin.social · 1 year ago

        I was once gifted a TV by a nice elderly guy. The TV had been the cutting edge of technology when it was built: it had a wireless remote! Although the remote worked with ultrasound instead of infrared…

        This beast took several minutes before it actually showed a picture.

        • Link.wav [he/him]@beehaw.org (OP) · 1 year ago

          Must’ve been a REALLY old one. I’m old as dirt, and they’ve taken mere seconds all my life. Even fast TVs now take longer to show a picture than the console ones we had when I was a kid, although I did see some from the 50s and 60s that took quite some time.

          • magnetosphere @beehaw.org · 1 year ago

            I used to have one of those black plastic (or was it Bakelite?) Space Commander 400 remotes, pictured in the black and white ad.

            I was walking home from grade school. Somebody was getting rid of their ancient TV, and had left it on the curb. The boxy, awkwardly shaped remote was in its “holster” on the TV, so I grabbed it and took it home. Before then, I had assumed that only infrared wireless remotes existed.

            The idea that a remote could work by ultrasound fascinated me, and the fact that it didn’t even need batteries absolutely blew my little mind.

            • Treczoks@kbin.social · 1 year ago

              Let me tell you how shitty they were, and why they probably put this thing out on the curb:

              The “receiver” part of that thing was so limited that it basically interpreted all kinds of ultrasonic sounds as “commands”. Whenever I pulled my curtains open or closed, the TV went nuts. It turned off, or it turned the volume up to 11, or whatever. Once I was working on a small piece of metal at my desk, and with every stroke it changed channels, either up or down. This thing was annoying.

  • apis@beehaw.org · 1 year ago

    Feels like everything is much more of a faff to set up, then one bit updates & something or other is no longer compatible.

    Don’t even want to think about the waste it must generate, both of devices & of the hours spent trying to get things to work, whether at the development end or in the home.

    • Swedneck@discuss.tchncs.de · 1 year ago

      at this point i don’t understand why people bother with TVs rather than just hooking up an actual normal computer to a big screen and watching youtube or torrenting media

  • navigatron@beehaw.org · 1 year ago

    I tell my laptop to put the video in the vga port. It does. That’s it. There’s nothing plugged in, but it’s there.

    I plug a vga cable in. There’s video in there now. With enough paperclips, I could get it out the other end. My laptop does not care. It wiggles the electrons regardless.

    I plug the other end of the cable in. The shielding was eaten by mice and two pins are dead. But alas, lo and behold, purple tho it may be - the video comes out and is displayed.

    Meanwhile, HDMI protocol negotiation wants to know if you’d like to set your screen as the default sound device. Not that Teams would use it anyway. Actually never mind, the receiving end doesn’t support the correct copyright protection suite. Get fucked, no video for you.

  • Queue@lemmy.blahaj.zone · 1 year ago

    Flatscreens: Set it gently on the table, or you might break the screen.

    CRTs: Set it gently on the table, or you might break the table.

    4K HDR at 120 hertz, plus surround sound we can easily set up, is great. But fuck, you sneeze wrong and it gets a weird scanline issue, like mine has.

    • Semi-Hemi-Demigod@kbin.social · 1 year ago

      Even old flat screens are ridiculously heavy compared to new ones. I replaced an old Sony 720p screen that weighed probably 20 pounds with a 1080p smart TV of the same size that I could lift one-handed. And the new one cost less than $200.

      • jarfil@beehaw.org · 1 year ago

        I somewhat miss my old Sony 720p screen… it came with a full electronics diagram in case you wanted to repair it.

  • ConsciousCode@beehaw.org · 1 year ago

    I grew up with CRTs and VCRs, hard pass. There’s a certain nostalgia to it all: the bum-DOOON sound as the electron gun warmed up, the smell of ozone and the tingly sensation that got exponentially stronger the closer you were, crusty visuals… But they were objectively orders of magnitude worse than what we have now, if only because they don’t weigh 150 pounds or make you wonder whether watching Rugrats in Paris for the 30th time on this monster is giving you cancer. Maybe it’s because I’m a techie, but I’ve never really had much issue with “smart” TVs. Sure, apps will slow down or crash because of memory leaks, and it’s not as customizable as I’d like, but I might be satisfied just knowing that if push comes to shove I can plug in a spare computer and use the TV as a monitor for a media system.

    I’m rooting it if it starts serving me out-of-band ads, though.

      • thehellrocc@beehaw.org · 1 year ago

        They don’t seem to have a lecturing tone in their comment. The only part you might have a point about is where they say “objectively”, but throughout the whole comment they’re really just expressing their opinion and sharing their experience with smart TVs, which they’re entitled to have and which might be different from yours.

        No aggressiveness intended. Just trying to keep the niceness around.

  • nocturne213@lemm.ee · 1 year ago

    I tried using the “smart” features of my TV once; they were so slow and laggy I plugged in my 7+ year old Roku and never touched them again.

  • Old TVs were shit though. The quality was a mess even with cable boxes. Pushing 4K TV through the air requires fancier signal processing, and fancy signal processing requires software. Asking consumers to pay more for good software is asking for them to switch to the competition, so you get buggy pieces of shit at great prices. As for casting, good luck getting that video on your phone to your analogue 625i CRT.

    You can get the old-timey experience back. First, you get a SCART-to-HDMI converter box.

    Then you plug your analogue device into the SCART input at the back, set the right options for the region your analogue signal is from (NTSC/PAL/SECAM), and plug your HDMI cable into the output. You can then ignore the remote that came with your TV and use your analogue devices on your huge modern flat screen in all their 625i@50fps blobby glory.

    • All of those issues are covered by other devices that most people already have. A PS4 Pro, Xbox One X, or any newer gaming system can output 4K and make the smart features in the TV unnecessary. The same is true of a cable box, Roku, Fire Stick, or any other streaming device. All the TV really needs to do is display the 4K signal it receives. TVs don’t even really need tuners anymore: just a USB hub, a processor for video and audio output, and a screen.

      • Skull giver@popplesburger.hilciferous.nl · 1 year ago

        But that’s exactly what TVs do? The buggy crap is all in the video casting features, the wireless display crap, and the apps you install onto it.

        If you just want a display with ports, get yourself a nice, big monitor. It’s a tad expensive because those things aren’t subsidised by ads, tracking, and paid-for services like so many cheap TVs are, but it’ll do nothing but display the signal you feed it.

    • apis@beehaw.org · 1 year ago

      I was happy with the quality, and don’t get more enjoyment from all the advancements since. But I only ever remember plugging it into the wall, plugging an aerial into the back & pressing one button to get the tuner to pick up channels. Batteries into the remote once that became a thing. Plug in a VCR or DVD player once they appeared.

      No need for a phone line or internet or updates.

    • Link.wav [he/him]@beehaw.org (OP) · 1 year ago

      I was fine with the quality of old TVs ¯\_(ツ)_/¯

      And no thank you, I’m not going to do all that. I don’t care enough about any shows to go through all that hassle. I just want my TV to work without extra expense, and I will complain when it doesn’t because I hate big corporations and I want them to fail.

    • jarfil@beehaw.org · 1 year ago

      > 625i@50fps blobby

      The comparison between CRT and digital is not as simple as “625 vs 4K”. Those analog signals were intended for a triangular-subpixel shadow mask with no concept of horizontal pixel count, making them effectively ∞×625i@50fps signals (1), compared to digital’s fixed 3840×2160@60fps square pixels regardless of subpixel arrangement.

      It takes about a 6x6 square pixel block to correctly represent a triangular subpixel mask, making 4K LCDs about the minimum required to properly view those 625i@50fps signals.

      (1) I’m aware of optics limitations, CoC, quantum effects, and ground TV carrier band limitations, but still.
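
      Reading the comment’s arithmetic as a back-of-the-envelope sketch: the 6-pixels-per-triad figure is the comment’s own, while the triads-across range below is a hypothetical ballpark (real masks varied widely), just to show the order of magnitude involved.

      ```python
      # Rough sketch of the arithmetic above, not a display-engineering formula:
      # ~6x6 square pixels per shadow-mask triad, 625 scan lines, and a
      # hypothetical 500-800 triads across the mask.
      PIXELS_PER_TRIAD = 6
      SCAN_LINES = 625

      for triads_across in (500, 800):
          needed_w = triads_across * PIXELS_PER_TRIAD
          needed_h = SCAN_LINES * PIXELS_PER_TRIAD
          print(f"{triads_across} triads across -> ~{needed_w}x{needed_h} pixels "
                f"needed (a 4K panel is 3840x2160)")
      ```

      Depending on the assumed mask density, the numbers land in 4K territory, give or take, which is roughly the comment’s point.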

      • Given a high enough quality signal and fast enough switching hardware inside the TV, you’re right. In practice, real-world shadow masks had a fixed resolution. It doesn’t matter how fine your control over the electron beam is if you’re still only capable of lighting colour phosphors of a limited resolution (between 500 and 800 depending on how crappy/good your TV was, several times that for some widescreen CRTs). You can apply some trickery to partially light the individual dots on the screen in low-light environments, but that requires the transmitter and receiver to map the input signal to the same shadow mask/aperture grille or you’ll mess up the colours. Infinite horizontal resolution only works for black and white displays.

        Emulating the exact pixel arrangement on a digital display would require some absurd resolution for sure, but back in the day the input signal rarely had that kind of resolution in the first place. No need to set up an 8000-line camera when the people watching your video only see 800. Very few things you can still hook up to a TV will produce more than low SDTV resolution, because the world moved to digital. Even LaserDisc, the greatest analogue format to hit the home market, has a horizontal resolution of about 440 lines.
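
        For a sense of scale: analogue horizontal resolution is usually quoted in TV lines (TVL) per picture height, so the full-width pixel equivalent on a 4:3 screen is roughly TVL × 4/3. A quick sketch with commonly cited ballpark figures (exact numbers vary by source):

        ```python
        # TVL is measured per picture height; multiply by the 4:3 aspect ratio
        # to approximate a full-width pixel count. Values are rough ballparks.
        FORMATS_TVL = {
            "VHS": 240,
            "Broadcast NTSC": 330,
            "LaserDisc": 440,
        }
        for name, tvl in sorted(FORMATS_TVL.items(), key=lambda kv: kv[1]):
            print(f"{name}: ~{tvl} TVL = ~{round(tvl * 4 / 3)} pixels across")
        ```

        Even LaserDisc’s ~440 TVL works out to under 600 pixels across, comfortably inside plain SD digital video (720×576).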

  • DSLeMaster@beehaw.org · 1 year ago

    I feel like I’ve missed something. I don’t dispute any of the horrible experiences people have had; however, I’ve had nothing but good luck. The only thing about our current television that bothers me is the promotional wallpapers that get applied every-fucking-time a new Disney property needs advertising. We buy relatively modestly priced units in the $300-$500 range, so maybe we just have different expectations than someone buying a much more high end unit. It is also possible that it has been pure luck and I’ll reply to this message one day soon to recant everything.

    • interolivary@beehaw.org · 1 year ago

      The TVs you buy force you to see ads even when you’re not watching a program and you’re like “I’ve had great luck”?

      • Some people don’t base their entire personality around hating the existence of ads and jumping through an outrageous number of hoops to avoid them.

        So yeah, count me in for a TV that always works how I want it to but has a background ad that I can completely ignore and that has no actual bearing on my life.

        There are so many more important things for me to spend my time and energy worrying about.

        • interolivary@beehaw.org · 1 year ago

          Ah yes, the only two options: complete apathy, or basing your entire personality around hating ads. Did I like hit a nerve or something, or why did you get so defensive over a joke?

          • Sorry, definitely touched a nerve.

            I am really enjoying Lemmy, but the vast number of users who are proponents of piracy/ad blocking without being open to ANY discourse is really frustrating and gets super annoying.

            There is a post about Blockbuster phasing out DVDs. One commenter said that since the enshittification of streaming services, they and their friends have been buying more physical media, so not being able to go to Blockbuster kind of sucks. Some users’ immediate reaction was to call them out for “being stupid” and provide a link for free streaming (aka piracy), saying that “as long as you have ublock then you’re good to go”.

            There are people here on Lemmy who will one second scream “Work reform! Unionize! Pay me a living wage!” and then immediately turn around and say “Yeah, I’m going to do everything in my power to prevent these creators of media from making money. Adblock + PiHole + VPN + piracy is the only way to use the internet. They don’t deserve my money.”

            I frequently pirate movies, AFTER I have purchased a physical copy of said movie. Someday I hope we can use something like the blockchain to tie ownership to digital files regardless of platform, but at the moment that’s a pipe dream.

            So in the meantime, I will have my ads or I will pay my monthly fees, because I want people to be paid for what they created. Even if it’s Disney or Marvel or insert giant company here, they created something that didn’t exist before that I enjoy. They deserve to make money off that.

            • interolivary@beehaw.org · 1 year ago

              Oh yeah, I definitely agree with you about how it’s downright myopic to be against all forms of ads, considering the economic system we find ourselves in. Especially silly are people who complain both about ads on news sites and about how clickbait-y news has become from chasing those ad views, but who also refuse to pay for news. Same goes for any ad-funded service, really; you’re not getting it for free, you’re “paying with your attention”, ie. ad impressions. The idea that we’re entitled to free content is ridiculously selfish.

              Sure, I think it sort of sucks that ads are so pervasive and can affect business models negatively, but them’s the breaks when content creators and service providers need to eat too. Ads just happen to be a much easier source of income than subscriptions, let alone voluntary donations (and it’s not like everybody can afford to eg. pay for news or whatever). Doesn’t mean it’s not possible to fund things without ads, but ads are obviously more dependable in many cases than the other options, or they wouldn’t be so popular.

              In previous conversations I’ve had smart-asses say that’s supposedly ironic considering I’m on a platform supported by voluntary donations, like the existence of eg. Beehaw means ads are completely unnecessary in all situations for all services, and everything could just run on voluntary donations or subscriptions. Sure worked out well for news media, didn’t it (the answer to that has been “hur dur mainstream media sucks so why would I pay for it”, totally oblivious to the fact that some of it sucks exactly because nobody wanted to pay for it anymore).

              So yeah, I definitely share your irritation at how frankly stupid some people’s attitude towards ads is, and how incredibly entitled it is to think that we’re somehow owed free content and services. Ads may be irritating but, as I said, with this economic system this is how it is and we’re just going to have to suck it up. Refusing to expose your delicate sensibilities to any ads out of principle isn’t going to make them go away as a concept, and it definitely isn’t going to pay for the goods and services that you now get for free.

              However, what – at least to me – feels different about TVs or other such doodads showing you ads is that you already paid for the product and you’re still getting ads. I guess it can be a way to lower the price of the product, but considering how they’re often not all that much cheaper it honestly feels more like just squeezing more money out of the users at the expense of user experience. Ads in otherwise free services do make sense, but this feels more, I dunno, predatory?

    • jarfil@beehaw.org · 1 year ago

      > promotional wallpapers

      What’s that? I mean, I know what a wallpaper is, but when would a TV display one of those?

  • Plume (She/Her)@beehaw.org · 1 year ago

    I hate smart stuff so much. It’s fucking impossible to find a good TV that doesn’t have all of this shit thrown in now. I just want a nice display, and that’s it. But what’s worse is when it not only comes with awful software, but the manufacturer also takes “the Apple route” for its features/services. So you have an issue like this and you can’t do anything about it.

    What am I talking about, you say? “The Apple way”, but I also like to call it the “fucking magic” syndrome. The “fucking magic” syndrome is when something is supposed to be “magic”, to “just work”, BUT WHEN IT DOESN’T… you’re shit out of luck. :)

    Because you see, it’s supposed to just work. It’s absolutely inconceivable for it to not just work. So the people who made it never even for a second considered that it might fail, and never took the time to implement some kind of failsafe in the UI to let you force the thing to do its thing on the off chance that the rabbit just refuses to come out of its hat.

    Anyone who’s had to update their AirPods knows exactly what I’m talking about. They’re supposed to update themselves, without you doing anything. But every now and then… THEY FUCKING DON’T, AND THERE IS ABSOLUTELY ZERO WAY TO FORCE THEM TO DO SO! You just have to wait with your AirPods in their open case for the moon to be in the correct position or something.

    • Link.wav [he/him]@beehaw.org (OP) · 1 year ago

      Unfortunately streaming has become the norm, and cable’s no longer affordable

      I’d rather go outside if that’s how it’s gonna be

      • TWeaK@lemm.ee · 1 year ago

        You don’t have to let your TV do the streaming though, it just needs to play it. Lots of other devices can connect to streaming services, e.g. gaming consoles and Blu-ray players. Me, I have a media PC that does all the fun stuff (and lets me stream my library while I’m away), but you could easily use an old laptop.

        • Onii-Chan@kbin.social · 1 year ago

          This is how I do it too. I bought a mini office PC and HDMI’d it to the TV. It’s nothing fancy, just an i5 with integrated graphics and a wifi card. I threw Kodi onto it, cancelled every streaming service I was using, and returned to the high seas to fill up the 10TB external HDD I connected. When a new episode of something drops, I just download it and Kodi nicely organizes everything.

          • TWeaK@lemm.ee · 1 year ago

            Absolutely, I didn’t mean to imply you were wrong for calling them out.

              • TWeaK@lemm.ee · 1 year ago

                You know, you don’t have to engage with everyone who comments on your post. Sounds like you’re taking on a little bit too much. Sorry for my part in that, but I hope you find a way to ease any stress you’re under.

    • ConsciousCode@beehaw.org · 1 year ago

      This is less an issue of “smartness” and more that analog signals degrade gracefully, whereas digital signals are all or nothing unless specific mitigations are put in place. HDMI hits kind of a weird spot because it’s a digital protocol based on analog scanlines; if the signal gets disrupted for 0.02 ms, it might only affect the upper half of the frame and maybe shift the bits for the lower half. Digital is more contextual and will resynchronize at least every frame, so this kind of degradation is also unstable.
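
      For a sense of the timescales, here is simple arithmetic assuming 1080p60 with the standard 1125 total scan lines per frame (actual HDMI mode timings vary):

      ```python
      # Simple timing arithmetic, assuming 1080p60 with 1125 total scan lines
      # per frame; real HDMI timings vary by mode, so treat this as ballpark.
      FRAME_RATE = 60
      TOTAL_LINES = 1125

      line_time_us = 1e6 / (FRAME_RATE * TOTAL_LINES)  # microseconds per line
      glitch_us = 20                                   # the 0.02 ms from above

      print(f"one scan line = ~{line_time_us:.1f} us")
      print(f"a {glitch_us} us glitch spans ~{glitch_us / line_time_us:.1f} lines, "
            "plus whatever stays desynced until the next sync point")
      ```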

      • jarfil@beehaw.org · 1 year ago

        > analog signals degrade gracefully whereas digital signals are all or nothing unless specific mitigations are put in place

        Not really. Digital signals come over analog mediums, and it’s up to the receiver to decide how much degradation is too much. Mitigations like error correction are intended to reduce the final errors to zero, but it’s up to the device to decide whether it shows/plays something with some errors, and how many, or whether it switches to a “signal lost” mode.

        For example, compressed digital video has a relatively high level of graceful degradation: full key frames come every Nth frame and are further subdivided into blocks. Each block can fail to decode on its own without impacting the rest, and the intermediate frames only encode block changes. So as long as the decoder manages to locate the header of a key frame, it can show a partial image that gets progressively more garbled until the next key frame. Even if it misses a key frame, it can freeze the output until it manages to locate another one.
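
        A toy sketch of that decode loop (not any real codec, just the key-frame/delta-frame structure described above): corrupted blocks are simply skipped, so stale data lingers on screen until the next key frame overwrites it.

        ```python
        import random

        # Toy model: key frames refresh every block; delta frames refresh only
        # some blocks; a corrupted block is skipped, keeping stale content.
        W, H = 8, 4                # frame size in blocks, not pixels
        KEYFRAME_INTERVAL = 10     # a full key frame every Nth frame

        screen = [["." for _ in range(W)] for _ in range(H)]  # decoded state

        def decode_frame(n, corruption=0.2):
            is_key = (n % KEYFRAME_INTERVAL == 0)
            for y in range(H):
                for x in range(W):
                    if not is_key and random.random() > 0.3:
                        continue  # delta frame: this block didn't change
                    if random.random() < corruption:
                        continue  # corrupted block: keep stale data, don't die
                    screen[y][x] = str(n % 10)  # pretend this is fresh data

        for n in range(12):
            decode_frame(n)

        # Cells still showing old digits are the "progressively garbled" parts.
        print("\n".join(" ".join(row) for row in screen))
        ```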

        Digital audio is more sensitive to non-corrected errors, which can cause high-frequency, high-volume screeches. It needs more mitigations, like filtering to a normalized volume and frequency distribution based on the preceding blocks, but it still allows a level of graceful degradation.
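
        A minimal sketch of that kind of audio mitigation (hypothetical, not from any particular decoder): track a rolling loudness estimate and clamp samples that shoot far above it, so a bit error becomes a dull click rather than a full-volume screech.

        ```python
        def soften_glitches(samples, window=64, headroom=4.0):
            """Clamp samples exceeding a multiple of recent average loudness."""
            out, history = [], []
            for s in samples:
                # Envelope from the *preceding* samples, as described above.
                envelope = (sum(history) / len(history)) if history else abs(s)
                limit = max(envelope * headroom, 1e-9)
                clamped = max(-limit, min(limit, s))
                out.append(clamped)
                history.append(abs(clamped))  # glitch can't poison the estimate
                if len(history) > window:
                    history.pop(0)
            return out

        # The corrupted 25.0 sample gets squashed to ~4x the recent average.
        print(soften_glitches([0.10, 0.12, -0.11, 25.0, 0.10]))
        ```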