I recently learned that my company prefers closed-source tools for privacy and security.

I don’t know whether the person who said that was just confused, but I am trying to come up with reasons to opt for closed source for privacy.

  • navi · 8 hours ago

    Security through obscurity isn’t security.

    The classic example:

    I have a website with no authentication which displays data that really should be locked down. But it’s OK because I never told anyone else the URL so no one will find it.
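    The failure mode is easy to demonstrate. Below is a minimal sketch (path name and data are made up): an endpoint whose only "protection" is that the path is secret. Anyone, including a crawler, who learns or guesses the path gets everything, because there is no authentication check at all.

    ```python
    # Sketch: a "hidden" endpoint with no authentication. The URL path is
    # the only secret; any client that requests it receives the data.
    import http.server
    import threading
    import urllib.request

    SECRET_PATH = "/reports-2024-draft"  # hypothetical "obscure" URL

    class Handler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == SECRET_PATH:      # note: no auth check whatsoever
                body = b"confidential payroll data"
                self.send_response(200)
            else:
                body = b"not found"
                self.send_response(404)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):  # silence request logging
            pass

    server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]

    # Any client that stumbles on the path gets the data:
    data = urllib.request.urlopen(f"http://127.0.0.1:{port}{SECRET_PATH}").read()
    print(data)
    server.shutdown()
    ```

    The secrecy of the URL is doing all the work here, which is exactly what "security through obscurity" means.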

    • s38b35M5@lemmy.world · 8 hours ago

      I never told anyone else the URL so no one will find it.

      Who wants to tell them about DNS records and web crawlers?

  • wizardbeard@lemmy.dbzer0.com · 11 hours ago

    In my experience the “privacy and security” argument is a smokescreen.

    The real reason is that it makes someone else responsible for zero-days occurring, for the security of the tool, and for fixing security problems in the tool’s code. With open source tools the responsibility shifts to your cybersecurity team to at least audit the code.

    I don’t know about your workplace, but there’s no one qualified for that at my workplace.


    A good analogy: If you build your house yourself, you’re responsible for it meeting local building codes. If you pay someone else to build it, you can still have the same problems, but it’s the builder’s responsibility.

    • jim3692@discuss.online (OP) · 10 hours ago

      That smokescreen argument makes a lot of sense. Both the company and our clients tend to opt for ready, out-of-the-box proprietary solutions instead of taking responsibility for the maintenance.

      It doesn’t matter how bad or limiting that proprietary option is. As long as it somewhat fits our scenario and requires less code, it’s fine.

      • 0x0@programming.dev · 10 hours ago

        instead of taking responsibility

        This is why: they prefer to shift the blame in case it hits the fan. That’s all, that’s it.
        They don’t care about code quality, maintainability, or anything else.

      • serenissi@lemmy.world · 6 hours ago

        It doesn’t matter whether the code is open here. Depending on what your company does, it might be cheaper to buy ready-to-use products from a vendor than to pay software/sysadmin people to review, deploy, and maintain them. It can even be required by law. Needless to say, there are many vendors selling support contracts for open software, either hosted or fully deployed and supported.

        Still, in many fields like medical, vendor lock-in means there isn’t much feature-complete open software, and you need the programs to be reliable, usable by non-technical people, and virtually unchanged over a long time. Providing those guarantees without depending on proprietary vendors means starting your own software company (and perhaps opening up your work so it doesn’t become just another closed product), and nobody does that.

        Security works much the same way. But in contexts like this, if someone lumps “privacy and security” together as a reason for closed source, it’s probably just BS.

  • s38b35M5@lemmy.world · 8 hours ago

    My past employers have said the same, until I showed them they were already using Apache, nginx, PostgreSQL, MariaDB, and OpenWrt, among other things.

    A lot of shops think that using proprietary tools means they can demand fixes for critical vulnerabilities, but in my experience, even proprietary dev teams just reply that the code maintainers are aware and working on a fix.

    Apache vuln? Here’s the link to their acknowledgment of that CVE and exactly what modules are affected.

    That may show that the flaw is in an unused module or component (or in something unrelated, like node.js), but even when it is applicable, they just wait for the code maintainers to address it. They take no responsibility themselves.
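    That triage step can be sketched in a few lines: take the advisory’s list of affected modules and the server’s loaded-module list (e.g. the output of `apachectl -M`), and check the overlap. The module names and the parsing below are illustrative, not from any specific CVE.

    ```python
    # Sketch: decide whether a CVE advisory applies to this server, based on
    # which Apache modules are actually loaded. Module names are hypothetical.

    AFFECTED = {"mod_lua", "mod_proxy_ajp"}  # modules named in the advisory

    def applicable(loaded_modules_output: str, affected: set) -> set:
        """Return the affected modules that are actually loaded.

        `loaded_modules_output` is the text of `apachectl -M`, e.g.:
            Loaded Modules:
             core_module (static)
             lua_module (shared)
        """
        loaded = set()
        for line in loaded_modules_output.splitlines():
            line = line.strip()
            if line.endswith("(static)") or line.endswith("(shared)"):
                name = line.split()[0]                      # e.g. "lua_module"
                loaded.add("mod_" + name.removesuffix("_module"))
        return loaded & affected

    sample = """Loaded Modules:
     core_module (static)
     lua_module (shared)
     ssl_module (shared)
    """
    print(applicable(sample, AFFECTED))  # mod_lua is loaded, so the CVE applies
    ```

    An empty result means the vulnerable module isn’t even loaded, which is exactly the “flaw is in an unused module” case above.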

  • Libb@jlai.lu · 12 hours ago

    I recently learned that my company prefers closed-source tools for privacy and security.

    I will suggest that same logic to my banker too: a vault whose key they won’t own, but I will. Don’t worry, all your money will be safe with me, it’s a promise 😇

  • unwarlikeExtortion@lemmy.ml · 12 hours ago

    Closed source does for privacy and security what sweeping problems under the rug does: it mitigates them a bit, but when they inevitably do hit, they hit hard.

  • jet@hackertalks.com · 10 hours ago

    There is some logic here: having a business relationship with a party that has a contractual duty to you is a stronger guarantee than what an open source project offers.

    For instance, Windows is source-available to many businesses, so in one sense it’s open source and in another it’s closed source. From a business perspective, that’s sometimes a reasonable trade-off.

  • UnfortunateShort@lemmy.world · 12 hours ago

    You can make an argument for confidentiality making it harder to find exploits in your code. If nobody cares enough to report them to you, or if you don’t have the resources to fix them, open-sourcing your code just exposes them.

    This is pretty much only an argument if you use stuff that would be irresponsible to use in the first place, though.

    • JubilantJaguar@lemmy.world · 11 hours ago

      If nobody cares enough to report them to you, or if you don’t have the resources to fix them

      To be fair, this scenario does feel worryingly like it might be common.

  • Autonomous User@lemmy.world · 6 hours ago

    A very common strategy for diverting blame away from yourself is to use fake security as a cover story for infecting yourself with anti-libre software, so that you are banned from fixing its source code. Also, saying ‘open source’ is a strategy to derail libre software.