When you consider time as a number line, years are not points at integers (which would in some way warrant a year 0), but rather the periods between them. Year 1 is the period between 0 and 1, and before that was -1 to 0, i.e. year -1. There is no year 0, because there isn't anything between 0 and 0.
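A minimal sketch of that convention in Python (the function name `year_label` is mine, purely for illustration): every point on the timeline falls inside a numbered interval, and no interval gets the label 0.

```python
import math

def year_label(t: float) -> int:
    """Map a point t on the timeline, measured in years from the epoch,
    to its conventional year number: the interval (0, 1) is year 1,
    the interval (-1, 0) is year -1, and no interval is labelled 0."""
    if t == 0:
        raise ValueError("t = 0 is the epoch itself, a point between year -1 and year 1")
    return math.ceil(t) if t > 0 else math.floor(t)

print(year_label(0.5))     # 1    (halfway through AD 1)
print(year_label(-0.5))    # -1   (halfway through 1 BC)
print(year_label(2023.3))  # 2024 (a point inside the year AD 2024)
```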
That makes sense, but trying to square it with the idea that the year 2000 is the start of the 21st century is hurting my head.
If year 1 is the 1st year, then surely the first year of the 21st century should be 2001?
It should all be zero-indexed. Positional number systems like the one we write with are (600 = 0600), but our language isn't, which causes this problem. Basically, if 2004 were in the 20th century, the gospels would have taken place in the 0th.
It is. The system is confusing.
That may be because it is not. The first century was years 1 to 100. The second was 101 to 200. The 21st is therefore 2001 to 2100.
What you're probably referring to is the "cultural century", which was considered to have started when the lead digit changed from 1 to 2. The same thing happened quite recently when some people argued 2020 was the start of a new decade (again, it wasn't).
I hate it when people say it wasn't the start of a new decade; it's a shit argument. Why does it matter what the first year was? 2014 to 2024 is also a decade, and 2 pm AEST on September 22nd, 2024 will also be the start of a new decade. There is nothing wrong with saying 2020 was the start of a new decade. (Again, it was.)
There are decades and there are decades. Just like there are weeks (period between Monday and Sunday inclusive) and weeks (any seven consecutive days).
When you say "I'll do this next week", you mean the next period between Monday and Sunday. When you say you'll do it in a week, it means you'll do it exactly 7 days from now, regardless of what day it is today. Same for decades.
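A quick Python illustration of the two senses (the date and variable names are chosen arbitrarily for the example):

```python
from datetime import date, timedelta

today = date(2024, 4, 17)  # a Wednesday, chosen arbitrarily

# "in a week": exactly seven days from now, regardless of the weekday
in_a_week = today + timedelta(days=7)                       # 2024-04-24

# "next week": the next Monday-to-Sunday block on the calendar
next_monday = today + timedelta(days=7 - today.weekday())   # 2024-04-22
next_week = (next_monday, next_monday + timedelta(days=6))  # Mon 22nd .. Sun 28th

print(in_a_week)
print(next_week)
```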
You said "start of a new decade", not "the new decade", in the original comment; they are very different.
???? You know very well what I meant, be more forgiving to second-language speakers
when I originally replied to you, I very obviously did not know that you meant that
I didn't know you weren't a native speaker either.
This explanation is unclear to me. Why do we choose the later of the two endpoints of the year for (0, 1) but the earlier of the two for (-1, 0)?
The language is rooted in the same logic we use for people's ages. Your first year was between the ages of 0 and 1. The first year before you were born was between -1 and 0. There is no 0th year, because 0 is a point in time and not a range of time.
Your explanation works equally well for any integer though. You could say the same of 1.
I think you're saying that it's a fencepost issue. But even for personal ages this doesn't check out: for a year after you are born, your age is "0." A one-year-old baby is in the following year.
I feel you've missed the point I was making and assumed I've made another. Age number and year number are different. You're in your first year when your age is not yet 1. You're in your second year when your age is between 1 and 2.
Years follow ordinal numbers, as in "this year was the first/second/third year of someone's life", not "this year was the year someone turned X years old".
Oh I see. Sure, historically it makes sense that years have been ordinal numbers. But in the modern era with all our math and computational knowledge, it is not convenient anymore. It means off-by-one errors are easy to commit when comparing BC and AD years.
This is why programming languages all index from 0 rather than 1 (Knuth and Lua be damned).
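A hedged sketch of that off-by-one trap (the helper names are mine): astronomical year numbering, which does include a year 0, makes the comparison safe.

```python
def to_astronomical(year: int) -> int:
    """Convert a conventional BC/AD year (negative = BC, no year 0) to
    astronomical numbering, which has a year 0: 1 BC -> 0, 2 BC -> -1."""
    if year == 0:
        raise ValueError("there is no year 0 in BC/AD notation")
    return year if year > 0 else year + 1

def years_between(start: int, end: int) -> int:
    """Whole years elapsed between the starts of two conventional years."""
    return to_astronomical(end) - to_astronomical(start)

# Naive subtraction suggests 10 AD minus 10 BC is 20 years, but because
# there is no year 0 the real gap is only 19.
print(years_between(-10, 10))  # 19
```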
Because until the Middle Ages, Europeans were afraid of the number 0.
For the same reason that 1.5 is to the right of 1, but -1.5 is to the left of -1.
Absolute value. Both systems count time from the same epoch, or zero point.
One year before the epoch is 1 January 1 BCE. One year after the epoch is 31 December 1 CE.
Half a year before the epoch (-0.5 years) is 30 June 1 BCE. Half a year after the epoch (0.5 years) is 1 July 1 CE. These dates fall within the first year before the epoch and the first year after the epoch, respectively.
If we were starting from scratch, it would probably be better to go with two year zeroes, so it would fit normally into positional number systems, and then you could even talk about 0.5AD for the relevant summer.
Unfortunately, positional numbering wouldn't be invented in the old world until hundreds of years after the Christian calendar was devised.
The only positional numbering system I use daily (base 10) has only one zero. What system are you talking about?
Oh really? What do -0.25 and 0.25 both start with, and round to?
A reminder to read the original reply that started this thread. There are two "zero-areas" between the one points and the zero point.
Ah, I see. You're advocating for naming the intervals (0, 1) and (-1, 0) by rounding toward zero rather than away from zero. I would advocate for rounding toward the lesser value: (-1, 0) -> "-1" and (0, 1) -> "0".
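To make the conventions in this sub-thread concrete, here is an illustrative sketch (the function names are mine): the traditional scheme rounds away from zero, the "two year zeroes" idea truncates toward zero, and the scheme above takes the floor.

```python
import math

def label_traditional(t: float) -> int:
    # Round away from zero: (0, 1) -> 1 and (-1, 0) -> -1; no year 0.
    return math.ceil(t) if t > 0 else math.floor(t)

def label_toward_zero(t: float) -> int:
    # Round toward zero: (0, 1) -> 0 and (-1, 0) -> 0; two "year zeroes".
    return math.trunc(t)

def label_floor(t: float) -> int:
    # Round toward the lesser value: (0, 1) -> 0 and (-1, 0) -> -1.
    return math.floor(t)

for t in (0.5, -0.5):
    print(t, label_traditional(t), label_toward_zero(t), label_floor(t))
# 0.5   1  0  0
# -0.5 -1  0 -1
```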
That could work. Calculating across eras would still end up sort of funny (the putative nativity would be a year closer to 233 BC than 233 AD, for example), but unless you're an archeologist that doesn't come up that often.
I had another conversation about this not that long ago, and it really does boil down to treating intervals as numbers. The Unix epoch doesn't officially extend to pre-1970 years, but it's defined as "the number of seconds that have elapsed [note the perfect tense] since" for that reason, and it does have a second 0. It's fair to guess Bede himself didn't properly distinguish between the two, because that leads directly to an argument that 0 is a number, which AFAIK doesn't appear in European mathematics until much later.
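For what it's worth, here's how that plays out with Python's datetime (a small sketch; timezone-aware datetimes sidestep platform quirks with negative timestamps):

```python
from datetime import datetime, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
before = datetime(1969, 12, 31, 23, 59, 59, tzinfo=timezone.utc)
after = datetime(1970, 1, 1, 0, 0, 1, tzinfo=timezone.utc)

# Seconds are counted continuously through the epoch: the count just goes
# negative before it, and second number 0 starts at the epoch itself.
print(before.timestamp())  # -1.0
print(epoch.timestamp())   #  0.0
print(after.timestamp())   #  1.0
```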
I think the only reason that the nativity would be a year closer to 233 AD than 233 BC is because Jesus was born in late December. Had he been born a week later, on the 1st of January, it would work out, with 1 AD starting a year after his birth and 1 BC starting a year before (year 0 being that of his birth).
The year was built around it, not the other way around. It's all derived from the Christian calendar. I'm not sure off the top of my head how Christmas ended up a few days before New Year's, but they're deliberately very close. It has been argued that the real-life birth might not have been in winter at all (or even in Bethlehem).
I digress, though. It would inevitably be lopsided somehow, because you've centered the numbering system around a point six months off from the New Year's points.
So in your idea there would be year +0 and year -0 before it, right?
Well, AD and BC(E) are the usual notation in this case, but yes. This is distinct from -0 and +0 in computation, because, as OP says, these are intervals rather than points.
Floating-point arithmetic on computers does suffer the existence of a negative zero, but it's generally considered an unfortunate consequence of IEEE 754.
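A quick demonstration of that signed zero in Python (just IEEE 754 behaviour, nothing calendar-specific):

```python
import math
import struct

pos, neg = 0.0, -0.0

print(pos == neg)                          # True: the two zeroes compare equal
print(math.copysign(1.0, pos),
      math.copysign(1.0, neg))             # 1.0 -1.0: but the sign bit is real
print(struct.pack(">d", pos).hex(),
      struct.pack(">d", neg).hex())        # 00...00 vs 80...00: distinct bit patterns
print(math.atan2(0.0, pos),
      math.atan2(0.0, neg))                # 0.0 vs 3.14159...: and it can change results
```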