Been hearing on the radio all kinds of Comcast ads like “we’ve raised our internet speeds for free!” I knew there was something else at play.
I’d argue that the main driver for all of this is the increased rollout of fiber. Companies like AT&T started broadly rolling out gigabit plans for the same price people were paying for sub-50-megabit cable plans. And the fiber lines handled neighborhood network congestion better.
Comcast has to figure out how to be competitive, or they are going to get their asses handed to them.
AT&T and Comcast are both terrible companies with horrible customer service, but fiber is always going to be better than copper.
I was hoping Starlink would also push traditional providers to be better, but it hasn’t gotten its shit together well enough yet to be a real threat. Should have known better than to think Musk would be helpful.
Except fragility
If only wire shielding and stress relief was a thing that existed…
My internet has been much more reliable since I moved to fiber.
Nah, they’ve been doing that for years. In the two years since I first got service at my house, I went from 200gig to 800gig with no price increase. It’s pretty much SOP these days when network upgrades take place in your area.
800gig? Can I come live with you?
500kbps upload and 1GB cap -Comcast
Wait a second? Do they mean an 800gig cap? That isn’t going to work. How can you only use 1GB a month? I use more than that per hour. Allegedly.
If not, they definitely meant 200mb upgraded to 800mb per second lol. If Comcast had 800Gb/s internet they would be competing with employee access to direct private fiber peering between the CIA and NSA lol
If you use a gig per hour you’re good with 800GB.
I don’t like playing that close to overages.
Fuck that, instead of making them increase their imaginary “up to” numbers, make them advertise contractually guaranteed minimums. I’d rather have a 25Mbps minimum than a 100Mbps maximum that usually sits around 8Mbps.
When I bought internet services and colocated with major carriers, every contract came with a Quality of Service rider that stipulated guaranteed quality and quantity of service. If my metrics fell below those minimums, I had recourse. But I could not extend that to my customers, because they were using a shared resource I was providing. In general, though, I agree that there should be a QoS guarantee with every user connection.
and 20Mbps for upload
What we actually care about.
100Mbps symmetric should be the minimum standard. 100Mbps down with 10Mbps up is worse than remote islands with mud huts. Seriously, I was on a Pacific island that looked like an after-hurricane photo op, and they had direct access to the fiber cables. So: symmetric gigabit internet, ONTs glued to the sides of huts, for a few bucks a month.
100Mb/s is still pretty abysmal.
A 4x increase for the download requirement and a 7x increase for upload.
That’s a pretty solid improvement, honestly. They also have plans on when to increase it to 1Gbps down/500Mbps up, so it seems like they are taking it seriously.
It’s long overdue and gigabit should be standard
It is long overdue, as the last update was in 2015, when a Democrat was president. The GOP refused to do it, and it took some time to seat a new FCC head due to Republican obstruction.
Gigabyte is coming, just not yet. This is a fine incremental step.
We should’ve had it when we paid for it, instead of telecom execs pocketing the money.
Gigabit
My third-world country’s internet has a minimum of 100Mbps on most plans in the cities.
100Mbps in the supposed best country in the world is shit, no matter how much higher it is than 2003 standards.
lol I’ve never had anything over 12Mb/s. Currently have 8Mb/s, which costs roughly half of what I used to pay for 500kb/s
I would love to have 100Mb/s. Hell even half that.
Satellite?
DSL
I’m so sorry.
It’s interesting. I have a remote place (not where I live) in the least populated, podunkest county in the state (which is saying something). And we were still able to get fibre and 50Mbps out there (and it could be higher, but not really worth the extra money since it’s rarely used).
Still within a couple hours of a big city, though. Guessing you’re further away than that, or something?
The 500kbps was 15 minutes outside of a metro area of 2.5 million lol
It was decades of CenturyLink making sure no one else moved in on their turf.
Where I’m at now the fiber is a couple of miles away and no cable, but 8Mbps feels lightning fast after CenturyLink lol
That’s enough to watch exactly one 1080p 30fps stream on YouTube and literally nothing else.
That’s why I stream 720p when I can lol
100Mb/s is 800Mbps. This is 25Mbps to 100Mbps so 3.125mb/s to 8.33mb/s
Mbps = Mb/s = Megabits per second.
MBps = MB/s = Megabytes per second.
The p is just the /. It’s the capital or lowercase B that makes the difference.
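If it helps, the whole bits-vs-bytes thing fits in a few lines of Python (speeds borrowed from the comments above):

```python
def mbps_to_mb_per_s(megabits):
    # lowercase b = bits, capital B = bytes; 8 bits per byte
    return megabits / 8

for advertised in (25, 100, 800):
    print(f"{advertised} Mbps = {mbps_to_mb_per_s(advertised):.3f} MB/s")
# 25 Mbps = 3.125 MB/s
# 100 Mbps = 12.500 MB/s
# 800 Mbps = 100.000 MB/s
```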
Shit, I found the one person who can actually remember the written difference between bit and byte
As a computer engineer, I had better know. And don’t get me started on MiB vs MB
Please do I’d like to know more! ;)
kB = kilobytes = 1000 bytes
MB = megabytes = 1000 kB
KiB = kibibytes = 1024 bytes
MiB = mebibytes = 1024 KiB
Generally your OS will show hard drive/SSD capacity in GiB (gibibytes). This is the reason a 1 terabyte drive actually shows as something like 931 GB in your system: your system uses GiB and the manufacturer uses GB.
1GB = 1,000,000,000 bytes
1GiB = 1,073,741,824 bytes
1 GB =~ 0.931 GiB
Edit: I had it backwards, it is fixed now
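If anyone wants to check those numbers themselves, here’s a quick Python sketch:

```python
GB = 1000**3   # gigabyte (decimal, what drive manufacturers use)
GiB = 1024**3  # gibibyte (binary, what many operating systems use)
TB = 1000**4   # terabyte

print(f"{GB:,} bytes per GB")              # 1,000,000,000
print(f"{GiB:,} bytes per GiB")            # 1,073,741,824
print(f"1 TB drive = {TB / GiB:.0f} GiB")  # 931, the '931 GB' your OS shows
print(f"1 GB = {GB / GiB:.3f} GiB")        # 0.931
```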
You messed it up, actually - it’s the bi units that are 1024
I hope ya know I was just messing with ya hahaha 🤣
- 3.125MB/s to 12.5MB/s
He is right though on megabits to megabytes. Internet speed is advertised in bits/s, whereas files and transfer speeds are usually shown in software as megabytes/s.
Cool, now make them use bytes as the system of measurement and we’ll be on to something.
I fear that will only happen when storage manufacturers are forced to use 1024 bytes per KB like everyone else.
In fairness it’s a very longstanding tradition that serial transfer devices measure the speed in bits per second rather than bytes. Bytes used to be variable size, although we settled on eight a long time ago.
1024 bytes per KB
Technically, it’s 1000 bytes per KB and 1024 bytes per KiB. Hard drive manufacturers are simply using a different unit.
Base 10 is correct and more understandable by humans. Everyone uses it except Windows and old tools; macOS, Android (AOSP), etc. all report sizes in base 10.
Found the hard drive manufacturer.
It’s 1024. It’s always been 1024. It’ll always be 1024.
Unless of course we should start using 17.2GB RAM sticks.
There’s a conflict between the linguistic and practical implications here.
“kilo-“ means 1,000 everywhere. 1,000 is literally the definition of “kilo-“. In theory, it’s a good thing we created “kibi-“ to mean 2^10 (1024).
Why does everyone expect a kilobyte to be 1024 bytes, then? Because “kibi-“ didn’t exist yet, and some dumb fucking IBM(?) engineers decided that 1,024 was close enough to 1,000 and called it a day. That legacy carries over to today, where most people expect “kilo-“ to mean 1,024 within the context of computing.
Since product terminology should generally match what the end-user expects it to mean, perhaps we should redefine “kilobyte” to mean 1024 bytes. That runs into another problem, though: if we change it now, when you look at a 512GB SSD, you’ll have to ask, “512 old gigabytes or 512 new gigabytes?”, arguably creating even more of a mess than we already have. That problem is why “kibi-“ was invented in the first place.
It’s not just the difference between kilo- and kibi-. It’s also the difference between bits and bytes. A kilobit is only 125 eight-bit bytes, whereas a kilobyte is 8,000 bits.
Computers run on binary, base 2. 1000 vs 1024: one is byte-aligned (2^10), the other is not.
That’s an irrelevant technical detail for modern storage. We regularly use billions, even trillions of bytes. The world has mostly standardized on base 10 for large numbers, as it’s easy to understand and convert.
Literally all of the devices I own use this.
Altice (Optimum) took this opportunity to cut upload speeds from 35Mbps to 20 under the guise of the “free upgrade”. You want your old upload speeds back? Oh, that’s their most expensive tier now.
I’m dropping them; it was too unreliable for working from home. I pay twice as much now for Fios.
The “upgrade” they’re speaking of is to the cars of all the executives?
Same for my “Xfinity” (Comcast) service. Literally the only plan with more than 20 up is the most expensive tier at 1200/35. Sadly, it has been that way for several years… but this year they had no choice but to jack up all rates across the board, so the most expensive tier is now $30 more expensive ($90 -> $120). No other competition, so… that’s that.
I care more about stability and low latency, not so much speed.
Offering me faster cellular or satellite connections doesn’t interest me.
There are features of IPv6 that would help there. I actually think pushing that to be rolled out widely is more important than 1Gbps connections.
What about cable
He’s a mid Deadpool villain
I’m a Booster Gold man myself.
Not an option for everyone
100Mbps is still very slow. Much better than 25Mbps, but still slow.
I have symmetric 1Gbps and do a LOT of data transfer (compared to 99.99% of people). And even then I rarely need, or would even notice, more than 100Mbps.
For most people, in the real world, why is 100Mbps “very slow”?
Because downloading large files takes hours instead of minutes
The vast majority of people are not downloading multi GB files frequently
I used to think that until I spent a bit of time with a gamer. 75-gig updates etc… the fuck is in those games! The whole Netflix library?
So games, 4K videos, etc…
This isn’t really true. An HD movie on Netflix/Hulu/Prime/etc is multi GB. It just doesn’t need to download fast, because anything faster than the bitrate of the movie won’t be perceptible.
But there are also games on platforms like Steam, Epic, PlayStation, etc. These are often very large.
For context, a 4K Blu-ray disc has a maximum transfer rate of 144 Mbps. Most streaming services are compressing much more heavily than that. Closer to 20 or 40 Mbps, depending on the service. They tend to be limited by managing datacenter bandwidth, not end user connections.
While I get that people hate having to download big games over 100Mbps, it’s something you do once and then play the game for weeks.
So build the capability and people will use it when they need it. My point still stands that 100Mbps is slow, even if most people are fine with it day to day.
Also, for a family of four, that would mean only 2 of them would be able to watch a 40Mbps stream at once. I get that it’s relatively rare for 3 people in a family to want to stream at that bitrate at the same time, but I wouldn’t call something fast if it can’t support even that.
(YouTube recommends a bitrate of 68Mbps for 4K 60fps content and 45Mbps for 4K 30fps. Higher when using HDR.)
Where I’m going with this is that there are much more important things than going significantly over 100Mbps. Quality of service, latency, jumbo MTU sizes, and IPv6 will affect you in many more practical ways. The bandwidth number tends to be used as a proxy (consciously or not) for overall quality issues, but it’s not a very good one. That’s how we’ll end up with 1Gbps connections that will lag worse than a 10Mbps connection from 2003.
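If it helps, the concurrent-stream math above is easy to sanity check in Python (the bitrates are the figures quoted above; real services vary):

```python
link_mbps = 100  # the connection under discussion

# Per-stream bitrates quoted earlier in this thread
bitrates_mbps = {
    "4K 60fps (YouTube recommendation)": 68,
    "4K 30fps (YouTube recommendation)": 45,
    "heavily compressed 4K service": 40,
}

for name, bitrate in bitrates_mbps.items():
    print(f"{name}: {link_mbps // bitrate} stream(s) on {link_mbps} Mbps")
# 68 Mbps -> 1 stream; 45 or 40 Mbps -> 2 streams
```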
Just updates running in the background use an enormous amount, let alone full game downloads.
Twitch and YouTube use a decent amount per hour as well.
Tell that to my torrent box 😎
You are not the majority of people
A file large enough to take hours, plural, at 100Mbps is more than 90GB. Doing that regularly is definitely not normal usage.
Tell that to everyone who has played a Rockstar game.
The average 4K BDRip movie is 60 GB, and the average AAA game is 60-100+ GB. So you are saying that watching a movie once a week and downloading one game is not normal? Using 1Gbit internet means saving 3-6 hours of time per week.
Watching movies and playing AAA games is normal, sure.
Downloading 4K BDRIPs and a new AAA game every week definitely isn’t. Most people probably stream their movies, and even those prone to pirating their content are likely downloading re-encoded copies, not full sized BDRIPs.
On top of that, it’s not like you have to sit there and wait for it. You’re only really saving that time if it’s time you were going to spend sitting and staring at your download progress instead of doing something else.
I’m not saying edge cases don’t exist where someone would notice a real difference in having >100Mbps, but it’s just that, an edge case.
Most of the time, the idea to watch a film comes to me quite suddenly, so I have to wait until the film is at least partially downloaded before I start watching it. And even downloading an app from the repository takes 10 times less time. And 1000Mbps internet is only 5-10 euros more expensive than 100Mbps.
Because downloading GTA V takes 2 1/2 hours at 100Mbps, and 14 minutes at 1Gbps.
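Rough math behind those numbers, as a Python sketch (the ~105 GB install size is my assumption; it varies by platform and patch level):

```python
def download_seconds(size_gb, link_mbps):
    megabits = size_gb * 1000 * 8  # decimal GB -> megabits
    return megabits / link_mbps

game_gb = 105  # assumed GTA V install size
print(f"{download_seconds(game_gb, 100) / 3600:.1f} h at 100 Mbps")  # 2.3 h
print(f"{download_seconds(game_gb, 1000) / 60:.0f} min at 1 Gbps")   # 14 min
```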
It’s amazing how much our views change with time. My dad was definitely a super early adopter of cable when it became available in our area; if I recall, it was 16 Mbps, which was unreal to me in 2002. I made do with 5 Mbps in uni and it was totally usable.
But now, I’ve had 1Gbps for years and wow, it’s so different; it changes your habits too. I don’t hoard installed games as much: I can pull them down in minutes, so why keep something installed if I’m not going to use it?
I remember thinking, “How am I ever going to fill this 100MB hard drive? That’s so much space!” That was some time around 1997, I think.
*cries in Australian*
My parents pay like 40 dollars per month for 1Mbps down and like 0.2Mbps up
Shit, that should legitimately be illegal.
Do they also have to feed the pigeons carrying the data packets?
I’d like to see a big government push to provide municipal services in every single metro area and extend it by whatever means into rural communities.
Xfinity keeps raising rates; I’m paying more now for just internet than what basic cable, internet + digital voice cost back in the 00s. While I get around 800 down, it’s still only about 40-something up, and it’s been like that for years and years.
I think we desperately need competition and if the government were to provide it, that’d be just fine.
Aww, that’s cute.
-posted from my 768k $80/mo broadband.
But only 20 ̶d̶o̶w̶n̶ up !!
!! :-(
It really does suck. Where I live, the base plan gives you 300Mbps down (which I know is pretty fast) but limits you to 10Mbps up. As much as they tout their speeds, you’ll only get them if you pay top dollar.
Sounds like Spectrum where I live, on the bright side our 300 down is usually closer to 350 down, but also their 10 up is usually closer to 8. Meanwhile you have to dig to find the upload speeds when you sign up, even though they have the download speeds plastered everywhere. Honestly, there should probably be a rule that ISPs can’t list download speeds without upload speeds right next to it.
Yeah, it is Spectrum; the company is quite irritating, and yeah, they should be required to show both up and down speeds next to each other. For a while I had T-Mobile internet, but the speeds were too inconsistent, so back to Spectrum it was.
up*
Here I am getting 5KB/s in California 😏
Where are you? I’ve lived in California my whole life and have had faster speeds than that since 1998.
I was kidding. I get 900+ Mbps on my phone while I only get about 400 max on my desktop at home. I live north of San Diego