Like a privacy-based, fully open source browser. Wouldn't it be more hackable because everyone knows the source code? And is a global privacy-based GPay alternative possible?

  • Gabu@lemmy.ml · 10 months ago

    What you’re describing is effectively an attempt at security through obscurity, which doesn’t work. If an attacker is interested in a specific target, going through the source code searching for bugs isn’t too different from performing a black box attack.

  • Ziggurat@sh.itjust.works · 10 months ago

    It’s a double-edged sword: everybody can look for vulnerabilities, which may help some attackers, but it also means anyone can volunteer to fix them. To my understanding, professional security auditors have concluded that (at least for big free projects) open source is safer than closed source, because more people fix bugs than exploit them.

    • nobloat@lemmy.ml · 10 months ago

      FOSS generally puts more pressure on people to write better and safer code, because you know everyone is going to look at it. Even when vulnerabilities are found, they are usually fixed much faster than on the proprietary side. There are stories of people waiting 6 months for Microsoft to fix a vulnerability, while an OpenSSH or OpenSSL issue is usually fixed in a few days.

  • FuckyWucky [none/use name]@hexbear.net · 10 months ago

    A global privacy-based GPay alternative does exist: see Monero. Monero is less than ideal because of its volatility, though, and it’s also really difficult to convert to actual money.

    The thing with GPay (the American one) is that a lot of the time it doesn’t handle the actual transfer; Visa and Mastercard do. For an

    Security through obscurity doesn’t really work well. Look at how Windows is closed source, yet most malware is made for it due to its market share.

    Also, Chrome is kinda open source (not completely but look at Chromium).

    With open source, if there is a security issue, it can be identified and fixed quickly. If the dev doesn’t fix it, you or someone else could fork the project and patch it.

    • dsemy@lemm.ee · 10 months ago

      Comparing GPay to Monero makes no sense; one is a system for making payments and the other is a currency.

      There is, however, an actual alternative: GNU Taler.

  • CALIGVLA@lemmy.dbzer0.com · 10 months ago

    Yes and no. While people can read the code and potentially exploit it, the opposite is also true: having full access to it means others can find flaws you’ve missed and contribute improvements. Being closed source can be a detriment in that regard, so much so that companies often rely on external security audits and the like.

  • Pons_Aelius@kbin.social · 10 months ago

    Neither closed source nor open source is a guarantee of a quality code base.

    There are white hat (good person) as well as black hat hackers.

    If everyone can see the source code, there are more eyes able to spot problems and fix them.

    • taladar@sh.itjust.works · 10 months ago

      And someone can fork the codebase if the original author or current maintainer refuses to fix major issues. Closed source software vendors refuse to do so quite frequently.

  • bad_news@lemmy.billiam.net · 10 months ago

    If a tree falls in the woods and nobody is there to hear it, does it still make a sound? The exploitable bug is still there whether or not it’s hidden by temporary obscurity, and who knows what state-aligned hackers already know, so it’s better to just get things fixed ASAP by having it all out in the open.

  • CanadaPlus@futurology.today · 10 months ago

    I’m annoyed that most of the answers are just “no”.

    It’s actually a great question, but practical experience has shown that closed-source software is just as buggy when written, and only slightly harder for an attacker to figure out, but much much harder to fix. And that’s not even talking about deliberate anti-features, like every app that hoovers up your data and sells it so you can order a pizza.

  • GarbageShoot [he/him]@hexbear.net · 10 months ago

    Security through obscurity is a notoriously sophomoric strategy that won’t keep out a dedicated attacker.

    That, and some major proprietary software has had built-in backdoors for decades at this point, I’m pretty sure (I think this is more of a Windows than an Apple thing, but Apple has its own issues).

  • Zevlen@lemm.ee · 10 months ago

    Good question. I’ve always wondered in general about open source and hackability. Like, is Linux much more hackable than Windows or Mac?

    • xmunk@sh.itjust.works · 10 months ago

      It depends on the distro but generally Linux is much more secure because it doesn’t have backdoors built in for weird shit like codecs and registration.

  • dsemy@lemm.ee · 10 months ago

    You don’t need the source code to find vulnerabilities.

    To fix them you almost always need access to the source code.
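
    As a toy illustration of that first point (everything below is made up, not from any real project): a black-box fuzzer only needs to be able to run the code to stumble onto a crashing input, but turning that crash into a patch still requires the source.

```python
import random
import string

def fuzz(target, attempts=10_000):
    """Feed random strings to a routine we treat as a black box and
    report the first input that makes it blow up."""
    for _ in range(attempts):
        payload = "".join(random.choices(string.printable, k=random.randint(0, 64)))
        try:
            target(payload)
        except Exception as exc:
            print(f"crash found: {payload!r} -> {exc!r}")
            return payload
    return None

# Hypothetical stand-in for a closed-source routine we can call but not read.
def parse_header(line):
    key, value = line.split(":", 1)  # raises ValueError when ':' is missing
    return key.strip(), value.strip()

fuzz(parse_header)  # finds a crashing input quickly; fixing it still needs the source
```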

  • (╯°□°)╯︵ ┻━┻@programming.dev · 10 months ago

    Putting aside the idea that everyone can read the code and find/fix exploits, sometimes it’s good for vulnerabilities to be out in the open, because then you know where and what to fix.

    Sometimes closed code is exploited for years before the owner or the general public finds out, and that leads to more problems.

  • 𝘋𝘪𝘳𝘬@lemmy.ml · 10 months ago

    The fact that most hacked software is closed source (e.g. Windows and most Windows tools) proves that open source software is not less secure.

  • Th4tGuyII@kbin.social · 10 months ago

    As long as there is incentive to do so, malicious actors will exploit the source code whether it is open or closed…

    Making something open source does make it easier for malicious actors, but it also allows honest actors to find and fix exploits before they can be used - something they won’t/can’t do for closed source, meaning you have to rely on in-house devs to review/find/fix everything.

    • Phen@lemmy.eco.br · 10 months ago

      I work on an open source project and I can tell you for sure that this only works in theory, at least for projects that aren’t giant like Chromium. If I push some code to a new branch on GitHub there will be people looking at the changes before I’ve even had time to open a PR, but very obvious security flaws can stay in the code for several years before anyone reports them. The number of people looking for things to exploit is just much larger than the number of white hats. Sure, they could still find the same flaws without access to the source, but we’re making it easier for them.
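
      To give a sense of what “very obvious” can mean, here is a hypothetical example (not from the project I work on): the unsafe version is a textbook SQL injection that anyone reading the source could spot, yet flaws like this can sit unreported for ages.

```python
import sqlite3

def find_user_unsafe(conn, name):
    # User input is pasted straight into the SQL string, so a name like
    # "' OR '1'='1" returns every row -- classic SQL injection.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(conn, name):
    # Parameterised query: the driver handles quoting, so the same
    # payload is just treated as a (nonexistent) literal name.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
```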

  • Rentlar@lemmy.ca · 10 months ago

    They are equally exploitable, but those exploits are generally easier to find and fix on open source software than closed.

    As an example, look at the exploit chain Apple only recently patched, “TriangleDB”. The exploit relied on several security flaws and undocumented functions, and it was used extensively in state-sponsored malware such as Pegasus for years. If any part of the exploit chain had been patched, the malware wouldn’t have worked. It took a Russian cybersecurity firm significant effort to track down how the exploit worked when they found out they were being targeted by the malware.