Vodafone Knackered???
#51
Forum Member
Join Date: Mar 2005
Posts: 3,644
Quote:
Was just network switches that were stolen, according to The Register.
Therefore the thieves not only got inside the building but also had time to unplug (or maybe just cut) a lot of cables and, importantly, unscrew the things from racks. Trust me, with rack-mount switches, even if you come seriously well tooled up like the terrorists in the Die Hard movies, it's still going to take several minutes to get these things out. And while someone might be able to carry one in each hand, they wouldn't for much distance, and not really on a motorbike or scooter. To steal enough of them to make the theft worth doing, they must have had a car or van very nearby.

Dear me, Vodafone. Either you have one hell of an insecure data centre by modern standards (normally you need to go through all sorts of cages and glass boxes to get into these places these days, and then multiple doors, cameras and access systems inside too), or some seriously defective security staff, or the robbers were helped - i.e. an inside job. HUGELY embarrassing for them, and that's before you start to look at how bad their customer communications on the incident have been, and the apparent total lack of a disaster plan to switch such vital services to a different data centre. The incident itself may now be resolved (finally seems to be in Hammersmith) but the multiple failures it's exposed are shocking.

I know a few data centres that have been broken into in the past - security guard held at knifepoint, tied up, and the robbers getting away with hundreds of thousands of pounds' worth of processors from big servers. I must admit the service should not have been out that long, as they should have had it backed up to a mirrored disaster recovery site.
#52
Forum Member
Join Date: Mar 2009
Posts: 14,546
Bear in mind though that this is a massive mobile network that should know better.
For example, where I work we have redundant sites and a fault-tolerant system: no single site should cause a service outage beyond a single cell site. The sites we have at work are housed in secure DCs - caged, with 24/7 security and remotely monitored IP cameras - and there is always a backup site for any critical equipment. The Vodafone building in the news article looked like just a random office building, not a highly secured data centre.
#53
Forum Member
Join Date: Oct 2004
Posts: 14,641
That there are still issues with Vodafone after 36 hours raises serious questions about their disaster recovery. I can think of only one other network problem that lasted longer (Orange's data troubles). But Vodafone's outage is pretty shocking - how can a small "attack" on a nondescript building on an industrial estate take out a mobile network for most of southern England? (Vodafone's PR waffle about it being confined to the "M4 corridor" is incorrect, as places as far west as Devon and Cornwall were badly affected.) What stinks more is the poor communication from Vodafone themselves - with an outage of this magnitude you'd have expected regular updates. All this from a network that recently ran an ad campaign saying how reliable their network is.
#54
Inactive Member
Join Date: Jun 2010
Posts: 648
After turning my iPhone 4 on I again suffered major issues with my phone, including not being able to make or receive calls - in fact, if you dialled my number it was a completely dead line - and my Vodafone dongle didn't work either. All this went on for over 24 hours. Then my iPhone was restarted several times and I switched off 3G, and my phone started working; my dongle, however, still didn't.
After being told by Vodafone that all phones were now working perfectly, mine still wasn't and neither was my dongle. I was so angry, and it didn't help that I had several prospective tenants coming to view my property; as I couldn't get in touch with them, many had a totally wasted journey into Hammersmith. I decided to totally wipe my iPhone and the internet service from my laptop, which took over 2 hours in total to sort out. All this worked and I was able to use my iPhone and dongle properly; however, none of the above advice was given to me by Vodafone. I am also extremely angry and frustrated at the lack of security - when I was told by Vodafone CS that copper wiring had been stolen, the whole thing seemed an absolute joke.
#55
Guest
Join Date: Dec 2007
Posts: 2,070
One of the chaps at work drove round to be nosey.
Said it looks like a bulldozer has been driven right through the building.
#56
Forum Member
Join Date: Apr 2006
Posts: 2,188
Well, still no working phone here. Totally hacked off with the whole thing now, as my friend and I ONLY have mobiles and we're both on Vodafone.
#57
Forum Member
Join Date: Mar 2004
Location: Bristol
Posts: 269
Don't think you guys quite realise how a mobile network works. Unless you want your bills to at least double or triple, you have to live with single points of failure. Each cell tower is connected to another one, or back to the switch site, via one single leased line or a microwave link. These cost millions of pounds a year to run. To have them doubled up for redundancy to separate sites would make the costs extortionate, and that would have to be reflected in your bills. There will be redundancy in the switch site itself - connections back to the core network, or around the building with multiple fibres and switches - but this is no good if both the primary and the secondary connections were ripped out. All the cell towers for a single area go back to the one building, and the building is probably also used to host servers specific to certain customers: voicemail for 100,000 customers will be there, another 100,000 in a different building, etc. For capacity reasons you probably can't move those customers to another server or you'll overload it, and it probably involves lengthy restoration and network reconfiguration.

To recover from this, their best and quickest option is to replace the switches that were stolen and reconnect the network. But again, for cost reasons you can't have a duplicate of every single piece of hardware preconfigured with the right config. They would have had to get something out of stores in another part of the country, or order quick sharp from a vendor, then configure it and install it, etc. All this takes time. If you worked in the trade you would know it's not quite as easy as you would like to think!
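The topology described above - every tower in an area homing back to one switch site - can be sketched as a toy model. The site names and tower counts here are entirely made up for illustration; the point is simply that losing the single parent site takes every dependent tower off the air even though the towers themselves are untouched.

```python
# Hypothetical backhaul map: each switch site is the sole parent of its towers.
backhaul = {
    "basingstoke-switch": ["tower-%d" % n for n in range(1, 41)],   # 40 cells
    "bristol-switch": ["tower-%d" % n for n in range(41, 81)],      # 40 more
}

def towers_in_service(topology, failed_sites):
    """Return the towers still reachable when some switch sites are down."""
    up = []
    for site, towers in topology.items():
        if site not in failed_sites:
            up.extend(towers)
    return up

# Healthy network: all 80 towers reachable.
print(len(towers_in_service(backhaul, set())))                    # 80

# One switch site ripped out: half the towers vanish in one go.
print(len(towers_in_service(backhaul, {"basingstoke-switch"})))   # 40
```

No amount of redundancy inside the surviving towers helps here; the failure domain is the whole region behind the stolen switches.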
#58
Forum Member
Join Date: Sep 2006
Location: Barnehurst, Kent
Posts: 538
Lack of information from Vodafone
For me, I can accept that things like this could happen, but what angers me more than anything is VF's lack of information on the issue.
#59
Forum Member
Join Date: Mar 2004
Location: Bristol
Posts: 269
Agreed. The technical aspects behind it are one thing, but there is no excuse for not keeping customers up to date. Lack of information makes customers really angry.
#60
Forum Member
Join Date: Sep 2003
Location: West London
Posts: 14,776
Quote:
Don't think you guys quite realise how a mobile network works. Unless you want your bills to at least double or triple, you have to live with single points of failure. Each cell tower is connected to another one, or back to the switch site, via one single leased line or a microwave link. These cost millions of pounds a year to run. To have them doubled up for redundancy to separate sites would make the costs extortionate, and that would have to be reflected in your bills. There will be redundancy in the switch site itself - connections back to the core network, or around the building with multiple fibres and switches - but this is no good if both the primary and the secondary connections were ripped out. All the cell towers for a single area go back to the one building, and the building is probably also used to host servers specific to certain customers: voicemail for 100,000 customers will be there, another 100,000 in a different building, etc. For capacity reasons you probably can't move those customers to another server or you'll overload it, and it probably involves lengthy restoration and network reconfiguration.
To recover from this, their best and quickest option is to replace the switches that were stolen and reconnect the network. But again, for cost reasons you can't have a duplicate of every single piece of hardware preconfigured with the right config. They would have had to get something out of stores in another part of the country, or order quick sharp from a vendor, then configure it and install it, etc. All this takes time. If you worked in the trade you would know it's not quite as easy as you would like to think!
#61
Forum Member
Join Date: Sep 2008
Location: Bracknell
Posts: 4,894
My phone's still not right after it. Texts are now arriving 4 hours after I send them.
#62
Forum Member
Join Date: Jul 2010
Posts: 24,103
I've seen some BT telephone exchanges where the main distribution frame - at which all the local loops start - is clearly visible through ground-floor windows. To wreck the phone system, it would just take a lunatic with a can of aluminium spray paint...
#63
Forum Member
Join Date: Mar 2009
Posts: 14,546
Quote:
Don't think you guys quite realise how a mobile network works. Unless you want your bills to at least double or triple, you have to live with single points of failure. Each cell tower is connected to another one, or back to the switch site, via one single leased line or a microwave link. These cost millions of pounds a year to run. To have them doubled up for redundancy to separate sites would make the costs extortionate, and that would have to be reflected in your bills. There will be redundancy in the switch site itself - connections back to the core network, or around the building with multiple fibres and switches - but this is no good if both the primary and the secondary connections were ripped out. All the cell towers for a single area go back to the one building, and the building is probably also used to host servers specific to certain customers: voicemail for 100,000 customers will be there, another 100,000 in a different building, etc. For capacity reasons you probably can't move those customers to another server or you'll overload it, and it probably involves lengthy restoration and network reconfiguration.
To recover from this, their best and quickest option is to replace the switches that were stolen and reconnect the network. But again, for cost reasons you can't have a duplicate of every single piece of hardware preconfigured with the right config. They would have had to get something out of stores in another part of the country, or order quick sharp from a vendor, then configure it and install it, etc. All this takes time. If you worked in the trade you would know it's not quite as easy as you would like to think!

If a network is properly designed you should be able to take out one piece and it shouldn't all come crashing down. Whilst I accept each site will only have a microwave link/chain, they usually go back to very localised locations which are then routed via dual redundant links back to a load-balanced system. From that point it becomes just like any other major data network; it should never have any big single points of failure. This wasn't just some little link of 40 cells feeding into the network - it was a large part of southern England that went offline. At that level you should have equipment with a high-availability design, and should the power fail in that building and the generator not work, or there be a fire, break-in, etc., another site should be able to cope (albeit with some reduced capacity).
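The dual-redundant design argued for above can be sketched in a few lines. Every site and link name here is hypothetical; the sketch just shows the property being asked for: each aggregation site has a primary and a secondary uplink to different core sites, so service survives the loss of any single link but not the loss of both.

```python
# Hypothetical uplink map: each aggregation site has two diverse paths.
links = {
    "reading-agg": {"primary": "core-london", "secondary": "core-bristol"},
}

def site_has_service(site, dead_links):
    """A site stays up if at least one of its uplinks is intact."""
    uplinks = links[site]
    return any((site, core) not in dead_links for core in uplinks.values())

print(site_has_service("reading-agg", set()))                              # True: healthy
print(site_has_service("reading-agg", {("reading-agg", "core-london")}))   # True: failover
print(site_has_service("reading-agg",
                       {("reading-agg", "core-london"),
                        ("reading-agg", "core-bristol")}))                 # False: both cut
```

The catch, as the earlier post points out, is that the second path is only useful if it terminates in a different building; diverse fibres into the same switch site fail together.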
#64
Forum Member
Join Date: Mar 2009
Posts: 14,546
Actually, further to my post above, MBNL (Three / T-Mobile) state in an Ofcom response that they use both microwave and leased-line fibre at some sites for capacity. So if that's the case, then surely the likes of Vodafone can afford a dual-redundant system for an area the size of the M4 corridor - what was it, 300,000 customers the service was lost for? Quote:
(c) Another contrast with leased lines is that microwave links are limited by distance and depend on the kind of microwave frequency employed. While a leased line can cover a distance of 75 km, a microwave link is usually used to cover a distance of around 10 km (this can be extended up to 20 km, but this brings with it a degradation of quality). This is normally resolved by introducing additional repeater sites, or engineering existing sites as 'hops'. The additional hops lead to additional capital investment.
(d) Another factor that operators have to consider when deciding whether to employ microwave links is that, as the network evolves in complexity, there is an increased risk of capacity bottlenecks and a reduction in end-to-end availability. In such cases, leased lines are often used to bypass the bottlenecks, and the distances required might not be conducive to the use of microwave.
(e) Although new technology has meant that some microwave links can cope with very high bandwidth requirements, they are not comparable to the very high bandwidth leased-line offerings in place for bandwidths beyond 155 Mbit/s.
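The figures in point (c) make the cost trade-off easy to illustrate: a distance one leased line could cover needs a chain of microwave hops, each intermediate repeater being extra capital spend. This is only a back-of-envelope sketch using the quoted ~10 km (or degraded ~20 km) hop lengths, not an engineering rule.

```python
import math

def repeaters_needed(distance_km, hop_km=10):
    """Intermediate repeater sites needed for a microwave path of this length."""
    hops = math.ceil(distance_km / hop_km)   # number of link segments
    return max(hops - 1, 0)                  # repeaters sit between segments

# A 75 km path - the distance the quote says one leased line can span:
print(repeaters_needed(75))       # 7 repeaters at 10 km hops
print(repeaters_needed(75, 20))   # 3 repeaters even at degraded 20 km hops
print(repeaters_needed(5))        # 0 - a short path needs no repeaters
```

Seven extra sites (masts, power, access, maintenance) versus one circuit rental is the kind of arithmetic behind operators mixing microwave and leased lines rather than duplicating everything.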
#65
Forum Member
Join Date: Sep 2006
Location: Barnehurst, Kent
Posts: 538
Well, my Vodafone coverage has been down in my area, DA8, for exactly a week now. Voda have said that this is due to "a site being down in your area", but they have no idea when it will be restored. They have given me £40 credit on my next bill, which is fine, but it's still a major pain.
#66
Forum Member
Join Date: Aug 2003
Location: Reading
Posts: 2,618
Quote:
Well, my Vodafone coverage has been down in my area, DA8, for exactly a week now. Voda have said that this is due to "a site being down in your area", but they have no idea when it will be restored. They have given me £40 credit on my next bill, which is fine, but it's still a major pain.
You know you can divert your inbound calls to another number? You could also see about getting a Vodafone Sure Signal - you'd then have coverage at home. This may not be the best suggestion... it does hurt to know you're paying Vodafone for their network downfalls!
#67
Forum Member
Join Date: Nov 2003
Location: Nailsworth, Gloucestershire
Posts: 10,402
I guess it depends on what type of switches they were, as some of them can be tens of thousands of pounds each (although they are heavy, and the thieves would have come prepared with the correct equipment to remove them from the racks and get them outside, as they would be hard to lift even with two people).
Quote:
If you worked in the trade you would know it's not quite as easy as you would like to think!
Quote:
If a network is properly designed you should be able to take out one piece and it shouldn't all come crashing down. Whilst I accept each site will only have a microwave link/chain, they usually go back to very localised locations which are then routed via dual redundant links back to a load-balanced system. From that point it becomes just like any other major data network; it should never have any big single points of failure. This wasn't just some little link of 40 cells feeding into the network - it was a large part of southern England that went offline. At that level you should have equipment with a high-availability design, and should the power fail in that building and the generator not work, or there be a fire, break-in, etc., another site should be able to cope (albeit with some reduced capacity).
A total equipment outage on any telephony equipment these days is very rare, therefore network resilience is based on partial failure; no network operator would base resilience on hardware being stolen.
#68
Forum Member
Join Date: Mar 2009
Posts: 14,546
Quote:
In terms of telephony equipment, try hundreds of thousands, if not millions, of pounds...
As someone who has worked in the telecoms industry for over 20 years, I have to agree. As xtaz says, building in resilience to be able to cope with every eventuality within a telephony network would cost millions of pounds; for that to happen, the cost to every user would be double, triple, or even quadruple what they pay now. How many people would be prepared to pay that? A total equipment outage on any telephony equipment these days is very rare, therefore network resilience is based on partial failure; no network operator would base resilience on hardware being stolen.
Core switches whose failure affects hundreds of thousands of customers shouldn't be the only ones; losing them shouldn't cause a total failure. Whilst I agree that everything can't be duplicated, core infrastructure like this, away from the cell tower infrastructure, shouldn't have single failure points.
#69
Forum Member
Join Date: Nov 2003
Location: Nailsworth, Gloucestershire
Posts: 10,402
Quote:
Core switches whose failure affects hundreds of thousands of customers shouldn't be the only ones; losing them shouldn't cause a total failure. Whilst I agree that everything can't be duplicated, core infrastructure like this, away from the cell tower infrastructure, shouldn't have single failure points.
Sorry, but I can't agree. If there had been a power failure at that site, it would have had the same effect as the equipment being stolen. If there had been a simple system failure, then that wouldn't have caused "a failure to hundreds of thousands of customers". All equipment has built-in resilience, from the power supplies to the main controller cards, the switch fabrics, outgoing line cards, etc., so in the event of failure of the active card, the standby card takes over. Core switches are designed to maintain switching capacity with as much as 70% failure of the switching fabric, so, again, are very resilient to failure. However, there wasn't a simple failure at Basingstoke - equipment was physically removed.
It isn't simple to switch traffic from one site to another. All switches work very close to capacity, to maximise the efficiency of the network, therefore if one element in that network is physically removed it is very, very difficult, if not impossible, to re-allocate that traffic elsewhere - there simply isn't the capacity to do so. Even a company like BT would have similar problems. A few years ago I had to change a sub-rack on a piece of equipment for BT. The sub-rack supported four 10 Gbit/s line cards, so before the rack could be changed out, BT had to off-load the traffic onto alternate systems. It took them four weeks to find sufficient capacity elsewhere to take the traffic off the sub-rack before I could change it out. So if it takes a company the size of BT four weeks to find enough spare capacity for a planned outage, do you seriously believe a company like Vodafone would be able to do something similar in a few minutes, with the added problem of stolen equipment?
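The capacity argument above reduces to simple arithmetic: if every switch runs close to full, the displaced traffic from a failed site exceeds the total spare headroom elsewhere, so it cannot be re-homed no matter how fast you reconfigure. All the figures and site names below are invented for illustration.

```python
# Hypothetical sites: all running near capacity, as the post describes.
sites = {
    "basingstoke": {"capacity": 100, "load": 85},
    "bristol":     {"capacity": 100, "load": 90},
    "reading":     {"capacity": 100, "load": 88},
}

def can_rehome(failed, topology):
    """Can the failed site's load fit into the spare headroom elsewhere?"""
    displaced = topology[failed]["load"]
    headroom = sum(s["capacity"] - s["load"]
                   for name, s in topology.items() if name != failed)
    return headroom >= displaced

print(can_rehome("basingstoke", sites))   # False: 22 units spare vs 85 displaced
```

Running everything at, say, 50% load would make re-homing trivial, but that means buying roughly twice the hardware - which is exactly the cost argument made earlier in the thread.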
#70
Forum Member
Join Date: Sep 2003
Location: West London
Posts: 14,776
So what would they do if the place totally burnt down one night, or (as is a common scenario in many disaster-planning exercises, whether or not it is likely to actually happen in Basingstoke) some sort of dirty bomb prevented access to the place for months - either way making simple replacement with new switches impossible? What would they do then? Just leave west London with no service for months? Their share price would go through the floor.
They have to have made a plan for these sorts of really serious disaster scenarios (surely? Unless they're a REALLY badly run company), and simple theft of the equipment should have just been one of those scenarios.
#71
Forum Member
Join Date: Nov 2003
Location: Nailsworth, Gloucestershire
Posts: 10,402
Quote:
So what would they do if the place totally burnt down one night, or some sort of dirty bomb prevented access to the place for months - either way making simple replacement with new switches impossible? What would they do then? Just leave west London with no service for months? Their share price would go through the floor. They have to have made a plan for these sorts of really serious disaster scenarios, and simple theft of the equipment should have just been one of those scenarios.
As regards disaster recovery from something like a fire, look at the problems caused by the fire at BT's North Paddington exchange last year as an example... Burne House Burns. The fire was caused as a result of flooding, but it shows the seriousness of such an event.
All exchanges have very sophisticated fire monitoring and extinguishing systems these days, so an exchange is unlikely to burn down. However, the impact would be severe simply because of the massive concentration of services through these places. In the case of a total loss, companies have mobile exchanges in trucks that can be deployed to get basic services up and running, but to replace the thousands of miles of cable and fibre after a total loss would take months.
#72
Forum Member
Join Date: Mar 2007
Posts: 5,444
Quote:
All exchanges have very sophisticated fire monitoring and extinguishing systems these days
#73
Forum Member
Join Date: Feb 2010
Location: Luton
Posts: 1,752
Couldn't find a single thing on this earlier! I was trying to install a machine using GPRS today, and nowhere on the Vodafone website did it indicate there was a problem! Useless status updates.
There are problems in Belfast - at least from Bangor to Ballyclare, which was tested... I mean, is it THAT hard to tell us that there is an issue? Why isn't there even a box on the front page of the vodafone.co.uk website? Or a banner? Or anything? Even on the status page? At least tell us, so we know and aren't doing work using your technology which we paid for, and don't look like complete morons when our customers tell us OUR supplied brand-new equipment doesn't work. It's understandable to an extent to expect us to call, but if you know something, tell us. This was the best I found this morning to indicate if there was a problem... and it doesn't indicate it! Seriously? Not impressed.
#74
Forum Member
Join Date: Nov 2003
Location: Nailsworth, Gloucestershire
Posts: 10,402
#75
Forum Member
Join Date: Sep 2006
Location: Barnehurst, Kent
Posts: 538
Day 11 of no Vodafone coverage in the DA8 area.
So now we're on day 11 with no coverage in the DA8 area. Vodafone have realised there is still a site down, but still have no idea as to when it will be restored. Have to admit this is quite amusing.