Apparently one was an 8-week-old. Heads need to roll here, but I have no faith in anything meaningful coming of this.
What would rolling heads actually achieve here? IMHO - there “just” needs to be stronger regulation that ensures carriers plan and account for this.
The regulation is already there, mandating that the telco route 000 calls. They failed, and as a result people died.
What the parent poster probably means by rolling heads is that this should not just be a fine for the telco, but people literally going to jail for criminal negligence.
How else is there going to be change? A monetary fine is just an operational expense that can be written off as "part of business if someone dies because of bad testing".
Optus has had three catastrophic incidents in as many years; there is a clear failure of management, and rolling those heads would make room for people not keen on repeating history.
> there is a clear failure of management and rolling those heads
It's more a structural issue.
Whilst it is a large chunk of assets, there's lots of competition, so it's not a really exciting business. The AUD has weakened, which makes it worth less by default.
So if you were its owner (in Singapore), what would you do? You can't really sell it (it's not worth it), you can't really invest in it, but you don't want to fold it.
The elephant in the room is that Australian landlines stop working whenever there is an NBN/Internet outage, or the power goes off. No 000 for you.
That's different, it's VOIP and it's part of your contract. Arrangements can be made to go around this.
But what happened here was, 000 calls that should have worked didn't, resulting in 4 linked deaths so far.
Having worked in that field a few years ago, I know that any minute in which 000 is inaccessible is a grave disaster. This was a colossal cluster f: 14 hours!
Interesting, the junction boxes here in the USA are full of 12v batteries to ensure they remain working when power goes out.
And when I was renting at a place with just VOIP, the modem had a slot for a battery to ensure the telephone remained operational when power went out.
Do neither exist in AU?
I've done some work for a telco and I was surprised to find that emergency calls are routed over completely different infrastructure to ordinary calls, and it is not routinely tested.
There wasn't an automated way to test it, and most people never thought at all about the emergency call routing because it was such a low number of calls (I think single digits ever).
It's easy to see how you could accidentally break emergency calling and not notice.
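A minimal sketch of the kind of automated check the commenter says was missing. `place_test_call` here is a purely hypothetical hook standing in for whatever vendor-specific test interface a real switch exposes; nothing below reflects an actual telco API.

```python
# Periodic health check for the emergency-call route (sketch only).
# `place_test_call` is an assumed, injectable hook: it should return
# True if a test call to the given number routes successfully.
import time

def check_emergency_route(place_test_call, number="000",
                          retries=3, delay=1.0):
    """Return True if the emergency route is up, False after all retries fail."""
    for _ in range(retries):
        try:
            if place_test_call(number):
                return True
        except Exception:
            pass  # treat an error on the test hook as a failed attempt
        time.sleep(delay)  # brief back-off before the next attempt
    # In a real monitor, reaching here should page the on-call engineer.
    return False
```

The point is only that the route gets exercised on a schedule at all, rather than waiting for a real emergency call to discover a break.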
Not sure which telco that is - but in the UK, impact on emergency calls is taken into account for every change that happens. This was non-negotiable in the 15 years I spent at a telco.
In the UK, based on the latest data, we get 35 million 999/112 calls per annum, roughly 96k per day.
But there's about 50,000 mobile phone towers, so still single digits per site.
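The per-site arithmetic above checks out, using only the figures quoted in this thread:

```python
# Back-of-the-envelope check of the per-tower call volume.
calls_per_year = 35_000_000   # UK 999/112 calls per annum (figure above)
towers = 50_000               # approximate UK mobile sites (figure above)

calls_per_day = calls_per_year / 365
per_tower_per_day = calls_per_day / towers

print(round(calls_per_day))         # ~95,890 -- roughly the 96k quoted
print(round(per_tower_per_day, 1))  # ~1.9 -- still single digits per site
```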
Would be wonderful if we could crowd-source regular testing. Could help catch device-specific issues like those on the Pixel line of phones.
Friendly reminder to anyone who installs or maintains PABXes: test your emergency calling whenever you make a change.
In Australia, you can call 000, say you’re testing a phone system, read out the Caller ID you’re supposed to be calling from, and they’ll confirm the number and location. This happens with the 000 operator, not the police/fire/ambulance operator you get transferred to in a real emergency.
Other countries may have different testing procedures.
In the U.K. you should email 999testcalls@bt.com first, although strictly speaking for a one off test (typically by an end user rather than a professional) it’s ok to just call and explain.
https://www.ofcom.org.uk/phones-and-broadband/telecoms-infra...
So let me get this straight. After a data breach and a massive outage, their first move is to hint that a few employees are to blame for this tragedy? It's a classic playbook move to find a scapegoat.
Yes, a similar thing happened for the major outage prior to this. The same thing is happening for the many data breaches that are occurring. It is never the decision maker's fault, always some poor employee that doesn't get a chance to present their case.
I was in similar meetings where such decisions were made and possible consequences were brushed off with "we just need to get this done as quickly as possible".
This won't stop until there are serious consequences for the businesses.
A similar thing happened around two years ago that, from memory, affected the trains and 000 (the Aussie 911 equivalent).
This is just another major crisis for Optus. It no longer has the technical capacity to operate a telecommunications network and its managerial class either doesn't know or doesn't care. As a corporation, Optus no longer serves any purpose and ought to be wound up.
> and its managerial class either doesn't know or doesn't care
They never had a say. Their parent Singtel were always effectively calling the shots.
That's only one side of it. Think about how much money they saved by offshoring and having """skilled""" migrants from India handling our technical infrastructure now.