
Is 5G even possible in Australia?

Discussion in 'Networking, Telephony & Internet' started by blondie_hunter, Aug 19, 2019.

  1. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    46,801
    Location:
    Brisbane
    Why even have the AABill then? What criminal will agree to police spying on their phone?

    And how does the Bill attempt to answer the question of getting access to a device without the owner's knowledge? Again, back in copper-phone-line days, that was pretty easy. Today, not so much. Either you are "bypassing protections" (my phone is encrypted and locked with a complex password, and will self-destruct its data after several failed unlock attempts), or you're not doing squat.

    Again, what's the point of the bill then? "We can tap phones only if crims let us"?

    It's clear from the wording that "non-weakening" code - new code that's vendor-sanctioned and doesn't rely on exploits - can be used to load keyloggers. What happens when these tools are leaked? (And yeah, they'll be leaked.) And again, I'd be highly surprised if these tools were being loaded in person, manually, onto individual devices.

    Point is that EternalBlue could have been plugged YEARS earlier. I repeat - there was a day when the NSA did the right thing and told vendors about holes in their code. Now, they sit on it and exploit it.

    So no, I can't blame the NSA for the hole. But I can, and do blame them for not disclosing the hole in a reasonable amount of time. Months - fine. Years - no.

    Valid questions are not hysteria. People are people, and they fuck up. I've said several times I'm not anti the idea of the modern-day equivalent of "wire taps" or keyloggers or whatever, but so far there has been no evidence that whatever tool is used to do so won't fall into the wrong hands.

    Again, back in the good old days, a leak of these tools resulted in minimal, localised damage. The concern of a great deal of people is what happens when tools that allow keylogging of a given brand of device that is being sold in the millions leak.

    Beyond that, all the "protections" in place are legal ones. They're there to stop the police from using exploits in what we call a "Swiss Army Knife" approach. The police must follow the rules, not remotely exploit the system, not load the keylogger without permission (I'm still dubious about how that last one actually works in practice when surveilling in secret, but whatever).

    But again, what happens when the tool leaks? Bad guys don't follow the rules. Bad guys stack exploits on top of each other to get to things with whatever tools they have. And a keylogger good enough for the feds to use is a pretty bloody juicy tool on top of a stack of basic remote-pwn exploit code.

    I'm willing to talk through your points here. Thus far you appear to be caught up on the legal limitations of EternalBlue, and the practical limitations of Odin, and not seeing the issue of what new tools the AABill could give birth to, and how they could be abused or leaked. And yes, I'm fully willing to concede the point that the private sector needs to come to the party and protect their shit better. But at the same time, there's a level of responsibility on the government if they're going to will these tools into existence in the first place by crafting bills that mandate them.

    In short, I struggle to trust a government (and in particular Dutton, who as an individual removed from any party or belief structure, continues to prove himself utterly untrustworthy and constantly acts in bad faith) that couldn't keep a filing cabinet full of secrets off eBay with writing laws that safely deal with something as complex as this issue. The bill, as it stands, leaves questions unanswered. Important questions that neither the bill nor your posts answer. And I'm quite afraid that there'll have to be a few "I told you so" type events coming from security advisers within the coming years, given typical trends in computing over the last decade.
     
  2. Sunder

    Sunder Member

    Joined:
    Apr 26, 2012
    Messages:
    4,508
    What do you mean? The AA Bill turns the "unauthorised" third party into an authorised third party. That's the whole point of the bill. Bypassing protections is a new power granted by the AA Bill - as long as it only affects a targeted person, not a system.

    And if they had disclosed it, wouldn't the only difference be that the criminals did their attacks years earlier anyway? What was the advantage?

    Here we go again... If done correctly under the new powers granted, damage, if any, will be minimal and localised. If a tool that allowed keylogging of a given brand of DEVICE is leaked, it wasn't a permissible tool in the first place. The clue is in the all-caps word - it's targeting a SYSTEM, not a PERSON, and is therefore a systemic weakness (or vulnerability).

    From the legislation:

    So, a tool that can be installed on any Samsung S11, or Apple iPhone XI, is a systemic vulnerability that is forbidden. But if it's a tool that can only be distributed to a single named person, it is permitted.

    Well, really, all of these protections are legal. There's nothing stopping police from exploiting vulnerabilities. In fact, I helped them do so several years ago when I worked as an ethical hacker. They can load keyloggers without permission. They did so before these rules were in place. The AA Bill is only about how far you are permitted to compel a commercial third party to do it on your behalf, and the limit is pretty low.

    We keep going over this. You're assuming this legislation can compel the creation of super tools that criminals don't have access to now. It can't. So to literally answer your question - nothing. Nothing will happen, because the only tools that can be created are ones that can only impact the security of one targeted individual named in the TCN.

    I'd prefer those two topics weren't even brought up, because neither of them is permitted under the new laws, and thus they're not good examples. You might as well say "What about handing every cop a Glock 19? The new AA Bill could let every cop shoot a criminal dead with no repercussions." Well no, it can't, it doesn't offer that power. It doesn't offer the power to force the creation of another EternalBlue (there are no laws preventing ME, a private citizen, from doing so, but neither does the AA Bill give any additional power to compel), and it doesn't offer the power to compel the creation of Odin, so why bring it up? I'm not "hung up" over it, you are - you seem to think the bill enables them to do these things, when it can't.
     
  3. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    46,801
    Location:
    Brisbane
    How does the AABill guarantee, technically, not legally, that a single tool can't be re-used on multiple people? And if it can't, what happens if and when that tool is leaked?

    The bill's ultimate goal is to enable access to devices through technology. Technology that didn't exist last year, and does by now, because the bill demanded it be created. Tools that no vendor would commercially be willing to make, because it would be financial suicide to reveal to your customers that you can break into their devices. Tools that had to be forced into existence by this very bill.

    Your argument is that the bill is neutral to the unlawful use of the tools it willed into existence. My argument is that, without the bill, the tools never would have existed, and are excessively risky just in their existence alone.

    I'm not worried about the bill today (well, I am, as per the "300K accesses in one year and too many of them without good reason" post, but let's ignore that for now). I'm worried about what our country (and the world, if these tools affect all regional models) looks like after this bill forces the creation of tools that are likely to fall into the wrong hands, given every example of exactly that happening over and over again.

    Your argument is that the bill is good enough to keep the feds honest. My argument is that the bill has already caused damage by forcing the creation of tools that allow access to devices in ways that shouldn't exist, and are currently very juicy targets for anyone who likes the idea of breaking in to other people's phones illegally.

    I'm far less interested in the legal ramifications and regulations on police of this bill. I'm far more interested in the tools created to satisfy the requirements of this bill, and what happens when they are leaked.

    I'm less interested in good guys who obey the bill. I'm more interested in bad guys who can profit from the tools that exist thanks to this bill. Bad guys who don't follow the rules, and prior to now didn't have the resources to design similar tools.

    And as much as this is the wording of the bill, I'm at a complete loss on how you could achieve this, technically. How does one ensure, technically not legally, that a piece of software like a keylogger is able to be limited in its "installability" to a single named user, or single device?

    And honestly, these are the sorts of comments I hear from true experts on the topic. As much as the bill has all these wonderful words in it to ensure checks and balances, nobody appears to have any idea how to ensure at a technical level that both (a) access is granted when requested, and (b) access forbidden at all other times.

    What's likely to end up happening is, to avoid fines and legal problems, companies just rush this stuff out and err on the side of "access". I suspect the same sort of cheap, rushed logic that results in public S3 buckets, "username admin, password admin" credentials, and other security whoopsies that put access over and above compliance when under time pressure.
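    To make that concrete, here's a rough, hypothetical sketch of the kind of audit check that catches the "public S3 bucket" whoopsie. The bucket name is made up, and it assumes boto3 and AWS credentials are available - a sketch of the idea, not anyone's actual tooling:

    ```python
    # Hypothetical audit snippet - invented bucket name, just illustrating how
    # easy the "public bucket" misconfiguration is to detect (and to make).
    import boto3

    PUBLIC_GROUPS = {
        "http://acs.amazonaws.com/groups/global/AllUsers",
        "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
    }

    def bucket_is_public(bucket_name: str) -> bool:
        """True if the bucket ACL grants access to everyone (or all AWS users)."""
        acl = boto3.client("s3").get_bucket_acl(Bucket=bucket_name)
        return any(
            grant.get("Grantee", {}).get("URI") in PUBLIC_GROUPS
            for grant in acl["Grants"]
        )

    print(bucket_is_public("example-rushed-compliance-bucket"))  # made-up bucket
    ```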

    If words in bills and laws were followed to the letter all the time, there'd be no crime to fight in the first place. My issue is very much what happens when this all goes titsup. Because, as humans, we're really good at fucking these things up.
     
    Last edited: Aug 20, 2019
  4. Sunder

    Sunder Member

    Joined:
    Apr 26, 2012
    Messages:
    4,508
    Because a company can LEGALLY refuse to create a tool that can be re-used on multiple people. How will you force them, if the law doesn't let you force them? They had no power before, they have no power now. It really is as simple as that. And if the tool doesn't exist, it can't TECHNICALLY be used on multiple people.

    I don't know how I can make it simpler. The police cannot force me to make a nuke. The law does not permit anyone to compel me to make a nuke. Therefore, there is no concern that this nuke I cannot make, can be stolen by Russian Cybercriminals to blow up Sydney. It is that level of ridiculousness now.

    Of course, you can say that a rogue police officer can bluff someone into doing this. But they could before too, and it's the fault of the company for not realising they'd been duped.

    Let's keep the argument simple. Can we get past this point or not?
     
  5. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    46,801
    Location:
    Brisbane
    I'll make you a deal: when I see details on how such a tool that allows access to only one unique, specific phone and prevents access to all others works, I'll be satisfied.

    Until then, your points are heard, understood and appreciated. But I'll remain a sceptic until something more concrete satisfies me.

    (And fair enough if that information is secret. In that case, I guess I'll just wait for it to leak.)

    (Also, apologies to everyone who came here to read about 5G. My bad.)
     
  6. JSmithDTV

    JSmithDTV Member

    Joined:
    Jun 13, 2018
    Messages:
    9,574
    Location:
    Algol, Perseus
    Just trust the gubbermint elvis, Dutton will protect us all... laws or protocol will never be broken and no methods will ever get into the wrong hands. :rolleyes:


    JSmith
     
  7. Sunder

    Sunder Member

    Joined:
    Apr 26, 2012
    Messages:
    4,508
    Nope, not secret. Here you go:

    1. ASIO finds out WhatsApp is the new favourite tool for terrorism planning ("9 out of 10 terrorists agree, WhatsApp is a cop beater!").
    2. ASIO goes to WhatsApp's parent and first asks for technical information, to see what is possible.
    3. They decide the easiest way to do this is to insert an additional library into the software that detects when both parties in a conversation are subjects of the TCN they're about to ask for, keylogs the keystrokes before encryption, and sends a copy to an ASIO WhatsApp account (getting around some other limitations not relevant to this). A rough sketch of that hook is below.
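    Purely to illustrate what that extra library would and wouldn't touch, here's a minimal, hypothetical sketch. Every name in it (TCN_SUBJECTS, forward_to_agency, the hook itself) is invented, and it assumes the app exposes a pre-encryption send hook:

    ```python
    # Hypothetical sketch only - invented names, not any real WhatsApp/ASIO interface.
    TCN_SUBJECTS = {"+61400000001", "+61400000002"}  # accounts named in the (hypothetical) TCN

    def on_outgoing_message(sender, recipient, plaintext, send_encrypted):
        """Called by the app just before a message is encrypted and sent."""
        if sender in TCN_SUBJECTS and recipient in TCN_SUBJECTS:
            # Copy taken pre-encryption, only when BOTH parties are named subjects.
            forward_to_agency(sender, recipient, plaintext)
        # Normal end-to-end encrypted delivery is untouched for everyone.
        send_encrypted(plaintext)

    def forward_to_agency(sender, recipient, plaintext):
        # Placeholder: in the scenario above, this would deliver a copy to an
        # agency-controlled account over the app's own transport.
        pass
    ```

    The point of the sketch is the guard condition: it does nothing at all unless both parties are named subjects, which is what keeps it on the PERSON rather than SYSTEM side of the line.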

    So far, the capability they've asked for is FAR less capable than a free keylogger. No criminal would be interested in this tool without substantial modification, in which case, they might as well write their own, or buy one.

    Okay, now we've hit a problem. So far, everything they have asked for is perfectly legal. But how the hell do they get it to the surveillance subjects?

    1. If they try to compel WhatsApp into building this functionality into the next update for everyone, they can't, because it then becomes a systemic vulnerability: the whole "class of technology" is affected, even if it's not activated for everyone.
    2. If they insist on a back door to selectively install this additional library - again, illegal, because that back door could be found by criminals and exploited.

    Fuck. You're right. They're stuck. This tool is effectively as useless as if they bought an Android keystroke logger from the dark web. :(

    No, not quite. They still have a few tricks up their sleeve:

    A. They now go to Google. They issue a TCN on Google: "You currently do not have the capability to selectively distribute software to a named person. We need that capability, and will pay you a fair price for it." Just like police can compel telcos to install interception equipment in exchanges, this is the digital equivalent for software-based communications. (A rough sketch of what "selective distribution" could look like follows the questions below.) Does this request breach any condition under section 317ZG, which I mostly quoted on the last page?

    Is this request systemic? No. It targets person by person.
    Does this request require breaking of encryption? No.
    Does this weaken authentication or encryption? No.
    Does this create a material risk to innocent third parties? Okay, there's a tiny crack someone might be able to argue - what's to stop a Google employee from using this to insert malware into their ex-wife/girlfriend/favourite celebrity's phone? Nothing more than what already stops them using existing capabilities to achieve the same goals in a different way. The risk needs to be material, not just theoretical or remotely possible, and this is why companies are by law required to be advised that they have a right to review by a technical expert and a retired judge, and that they have avenues of appeal. The interception agency may insist that the changes are made by their own software only, so Google employees don't have access to that function, or it might need to be "co-authorised" by one Google employee and one interception agency officer. Whatever they negotiate is fine, as long as the technical expert and retired judge agree it reduces the risk below the material level and is proportionate to the risk the interception agency is trying to mitigate.
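    For what it's worth, here's a minimal, hypothetical sketch of the "selective distribution" capability in point A. All identifiers are invented; it's only meant to show that targeting a named person/device is a lookup, not a change to everyone's build:

    ```python
    # Hypothetical sketch - invented identifiers, not any real app store mechanism.
    # The store offers the modified build only to the exact user/device named in the notice.
    TARGETED_INSTALLS = {
        ("user-id-from-tcn", "device-serial-from-tcn"): "messenger-app-tcn-build",
    }

    def select_package(user_id: str, device_serial: str, default_pkg: str) -> str:
        """Return the package a given user/device should be offered."""
        return TARGETED_INSTALLS.get((user_id, device_serial), default_pkg)

    # Everyone not named in the notice keeps getting the stock build:
    assert select_package("any-other-user", "any-other-device", "messenger-app") == "messenger-app"
    ```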

    But let's say you have a hard-arsed retired judge who won't accept any level of risk - that's fine. The laws are still not useless to these agencies:

    B. Alternatively, they already know a vulnerability they're not telling anyone about. They use that vulnerability to install it. Remember, this was legal before the AA Bill, still legal after, legal for law enforcement, legal for private citizens, and even criminals won't get arrested for writing an exploit tool (unless they also use it on a computer system they have no authority over). And police have so many existing powers that make trivial system weaknesses usable to them that aren't usable to criminals. Want to install malware? Find a site the target uses that doesn't use cert pinning, do an HTTPS downgrade, and modify the content to your heart's content. Criminals can't easily do that.
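    As an aside, "cert pinning" just means the app refuses to talk to a server whose certificate doesn't match a hash it already knows, which is what defeats that downgrade-and-modify trick. A minimal sketch of the idea - the pinned hash is a made-up placeholder, and real apps pin via their HTTP stack rather than a manual check like this:

    ```python
    import hashlib
    import ssl

    # Made-up placeholder; in practice this is the SHA-256 of the server's known certificate.
    PINNED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"

    def cert_matches_pin(host: str, port: int = 443) -> bool:
        """Fetch the server's certificate and compare its hash to the pinned value."""
        pem = ssl.get_server_certificate((host, port))
        der = ssl.PEM_cert_to_DER_cert(pem)
        return hashlib.sha256(der).hexdigest() == PINNED_SHA256

    # An app doing this (or the platform equivalent) simply refuses the connection
    # when the hash doesn't match - which is why the downgrade trick needs a site
    # that *doesn't* pin.
    ```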

    C. They have compromised an associate, and use their account to social engineer in the malware. The associate sends an email with "Hey, check out this new sermon from Sheikh Al Dodgy Brother", police intercept, modify the link, and it insists the recipient "install a new version of this app before you can view this video". (That would obviously still need some cooperation from Google, but possibly less; I haven't thought deeply about the technical mechanisms of this approach.)

    The laws have been well thought out. So many of the early criticisms - such as not being able to tell your boss because of secrecy provisions, or worry that the broadness of the initial wording allowed "de facto backdoors" through a generous interpretation of the powers and limitations - have been explicitly worded out in the final version. Your concerns may have had validity in the first draft, but not in the final.

    After an intelligent conversation with Elvis, I'm not sure I should stoop so low as to respond to that, especially since I already have:

    At least now we know the intelligence of some of the posters who "contribute" to the debate.
     
  8. JSmithDTV

    JSmithDTV Member

    Joined:
    Jun 13, 2018
    Messages:
    9,574
    Location:
    Algol, Perseus
    You already have stooped lower by posting this.

    At least there was this admission in there; that's something.


    JSmith
     
  9. Doc-of-FC

    Doc-of-FC Member

    Joined:
    Aug 30, 2001
    Messages:
    3,396
    Location:
    Canberra
    I don't think so, you're pretty far down there by most people's standards of decent intellectual conversation.
     
  10. Sunder

    Sunder Member

    Joined:
    Apr 26, 2012
    Messages:
    4,508
    Meh, it was almost midnight after a 5am start. I don't normally get that narky, but yeah, that argument of "Don't allow this law, because not everyone will obey it" is such a gaping logical fallacy that I'm surprised it passes anyone's verbal filter.

    I've been waiting for someone to say "Ah, but having ANY rights at all makes the bluff believable, and therefore makes the job of someone who intends to break the law much easier", which at least brings some logic to it... But alas, nobody who has used that excuse seems to be able to think that far.

    But I guess the quick and ready counter to that is: maybe we should stop collecting tax then, in case having someone with the authority to demand tax payments lets criminals do this: https://www.news.com.au/finance/mon...g/news-story/5c6cfe703026501d47c0ebf19b0d23c3
     
  11. JSmithDTV

    JSmithDTV Member

    Joined:
    Jun 13, 2018
    Messages:
    9,574
    Location:
    Algol, Perseus
    Seems you may have taken that spot now. :cool:

    It's interesting that when some people disagree with one another, the intellectual high-ground card is pulled... it tends to operate in reverse in my experience, however YMMV.

    Fair enough, apology accepted. I also appreciated the clearer explanation you gave earlier of how certain requests may operate.

    Surely though you at least understand the general point of view of those who object to these new laws and what sort of world they may bring - Orwellian comes to mind. This is a slippery slope in my and many others' opinions. I hope the right balance is struck at some stage; however, I still feel it is skewed at this stage.


    JSmith
     
  12. Sunder

    Sunder Member

    Joined:
    Apr 26, 2012
    Messages:
    4,508
    Yes and no. All this is doing is pretty much giving back the same capability that law enforcement had before technology defeated it.

    But the use of technology has gotten more invasive, so creating a "matching capability" is now actually more invasive. People previously didn't send 180-character letters to the people they were conspiring with. They do now, and they do so with regularity. So the digital equivalent of opening letters and inspecting the content is not more invasive now because we can keystroke-log apps - it's more invasive because apps are so convenient that people say almost anything online and chatter about nothing, which is more evidence to either collect for prosecution, or inspect and discard.

    Irrespective, what price for safety? Some will accept bars on their windows to prevent a break-in, even though it ruins the view and potentially stops them climbing out in a fire. I live in an area where I left the front door open for 2 weeks, and my home theatre system - visible from the street - was still there when I got back. I won't pay that price for window bars. Perhaps if you lived in Mount Druitt, that's a pretty low price to pay. In the same way, perhaps my view is somewhat skewed, because I see the crimes committed every day, whereas you're thinking our last terror attack was years ago, and it was a lone wolf anyway. (Which would also be skewed, because proof the powers are working is not the number of attacks that happened - it's the number of attacks that were stopped.)
     
  13. JSmithDTV

    JSmithDTV Member

    Joined:
    Jun 13, 2018
    Messages:
    9,574
    Location:
    Algol, Perseus
    Fair point; however, how can anyone independently verify this when these things are classed as operational matters and the organisations are exempt from freedom of information laws? I think there needs to be more transparency, which in turn will help people have more trust in the information, rather than just believing Dutton on the news.


    JSmith
     
  14. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    46,801
    Location:
    Brisbane
    Always an interesting question. More so in the modern world where the words you utter in private today might be fine within current social context, but they can be brought up in 20 years and used to ruin your career when the social context changes.

    FWIW, I don't know what the answer is. But I do know that the "I have nothing to hide" statement can change quickly when different governments come in to power.

    I wonder what a generation of kids brought up on Twitter/Reddit will look like once they're all old enough to enter politics, and all their 14 year old shit-talking comes back to bite them in the arse? The only difference between this generation and mine was that my generation didn't have the technology to permanently record every stupid mistake they made.
     
  15. Sunder

    Sunder Member

    Joined:
    Apr 26, 2012
    Messages:
    4,508
    Thankfully nothing in the Metadata laws or the AA bill will allow storage over this time frame. It's private sector that you need to worry about there.
     
  16. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    46,801
    Location:
    Brisbane
    Yes, very true. At this point archive.org's Wayback Machine and Twitter are the greatest risks to any young person with political aspirations.
     
  17. Doc-of-FC

    Doc-of-FC Member

    Joined:
    Aug 30, 2001
    Messages:
    3,396
    Location:
    Canberra
    So you have proven that you can remain impartial; how about you go back to my earlier posts and respond to them with the same level of attention to the facts at hand, without coming off as a dick with a response such as:
    when I put forward an independent audit of Huawei firmware from July 2019.
     
  18. JSmithDTV

    JSmithDTV Member

    Joined:
    Jun 13, 2018
    Messages:
    9,574
    Location:
    Algol, Perseus
    Where are the same audits of all other major vendors?


    JSmith
     
  19. BAK

    BAK RIP

    Joined:
    Jan 7, 2005
    Messages:
    1,267
    Location:
    Ringwood VIC
    I can't see Google (or Apple) simply complying quietly with such a request. It would undermine trust in their platform. So they make it known they've developed this capability per law enforcement request, and people start publishing "clean" APK files for said apps on various "underground" sites. Or the "targets" start using disposable phones, changing them regularly enough that targeted delivery is rendered moot. That's also not to mention that if it gets out that app stores are allowing this kind of selective delivery, brands known for their security/privacy features are more likely to yank their apps and publish them a different way.

    Because if Australia requires this feature to track terrorists, it might be palatable. But what about when Saudi Arabia requires this feature to vet wives' messages to ensure they aren't adulterers?
     
    JSmithDTV likes this.
  20. Sunder

    Sunder Member

    Joined:
    Apr 26, 2012
    Messages:
    4,508
    Apple have already demonstrated a commitment to frustrating law enforcement, so I suspect you're right. But at least it's a few hundred thousand extra in fines that the government can use to look for, or develop another capability they can use, every time Apple refuse.

    Just like the bosses of tradies write off the $90 parking fines for parking on the footpath to save their tradies the effort of lugging tools from the closest car park, will Apple and Google just call this the cost of doing business in Australia?
     
