Comment Re:Obligatory griping (Score 1) 209

The terran computational calendar does not define midnight.

Then you will run into another long-term problem: Universal Time (Coordinated and otherwise) is wholly independent of solar time and the concept of "day." While Universal Time's definition was intended for it to resemble solar (i.e. civil) time at its adoption, that approximation will eventually fail, rendering any system that uses Universal Time to demark and count days ambiguous.

Comment Re:Obligatory griping (Score 1) 209

True, but I would hope that future programmers would realize (somehow) that a year isn't exactly 365.25 days.

The programmer sees a discrepancy happening over a century into the future as "Somebody Else's Problem," and there would be little reason for users a century from now to be aware of this presumption. Even if the users know the proper algorithm, they may not know that the code they're relying on doesn't. Consider how long Excel has kept rendering AD 1900 as bissextile (a shortcut inherited from Lotus 1-2-3 and never removed).

Y2K, UNIX time overflow, etc. Only this time the original coders will be long dead. Best hope they commented better than they coded.
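
Purely as an illustration of my own (nothing from the calendar's author): the gap between the full Gregorian rule and the careless shortcut is a one-line test, which is exactly why it keeps getting skipped.

    # Full Gregorian leap-year rule vs. the naive shortcut behind the 1900 bug.
    def is_leap_gregorian(year: int) -> bool:
        """Divisible by 4, except century years not divisible by 400."""
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    def is_leap_naive(year: int) -> bool:
        """The shortcut a careless implementation ships."""
        return year % 4 == 0

    assert is_leap_naive(1900) and not is_leap_gregorian(1900)  # where they diverge
    assert is_leap_naive(2000) and is_leap_gregorian(2000)      # where they agree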

Correct. This calendar does not track the moon, or all seasons, it only tracks the year (n. winter solstice) and the day.

Keeping track of and predicting seasons is the entire point of a calendar.

+4Q probably shouldn't be used for businesses though which should probably just include 'minimonth' in the last quarter, but that's not in the scope of the terran computational algorithm.

A calendar that can't unambiguously reckon the Christmas shopping season doesn't seem to lend much utility to business.

Not exactly, if there isn't a TC designator appended to it, then it isn't a terran computational date.

Then you're seeking to have this implemented alongside the Gregorian Calendar (et al.). So what are you actually hoping to replace? Julian dates?

Ordinal numbering is not recommended for the terran computational calendar.

The attendant cultural shifts you're asking users to adopt alongside this system are steep barriers to entry that will not exactly aid adoption.

Comment Re:Thirteen months, who's on crack? (Score 1) 209

The Jewish calendar changed in 1513 BCE.

"It was a Tuesday."

From what I've seen on the subject, you're well away from evidence-based archaeology and deep into the realm of Biblical literalism.

It also marked a break with common regional tradition and a start of calculations based on national and cultural identity

There were no "calculations," or even any need for calculations, before the Babylonian Captivity and Diaspora. The Israelites were among several cultures that relied instead on terrestrial, ecological indicators for the start of spring (e.g. "seeing if a groundhog sees its shadow"). The Hebrew name of the Paschal month literally refers to the barley crop that was to be inspected (as per Exodus). The beginnings of months were reckoned empirically, as per current Islamic practice.

It was only after a non-negligible number of Jews lived too far away from the Temple (when there was a Temple) that the need for a computational calendar to maintain social cohesion (i.e. celebrating the same holidays on the same day) among the Diaspora presented itself. The modern Hebrew calendar relies on some decidedly Chaldean math that they likely picked up during the Captivity.

I know of no reason for Christians to keep track of any intercalary months, as Veadar (literally "and Adar [again]") ends by the time of any point of calculation for Easter (or more accurately, the memorial of Christ's sacrificial death)

If for no other reason than because Christians must be able to reckon the date of Easter months in advance in order to set the beginning of Lent, etc. Predicting the first full moon of spring (which defines the Paschal moon) requires some means of keeping track of lunations, both to know which new moon, thirteen days prior, marks the proper start, and to be able to count backwards the requisite number of days and weeks for the related movable feasts.

Predicting and setting the date of Easter requires knowing how many lunations pass between one Paschal moon and the next and how long each of them is. The Alexandrian computus settled upon by the early Church has always, necessarily, acted as a perpetual lunar almanac for the entire year (e.g. the Paschal moon is always the fourth of the year, and always 29 days long). The Gregorian method simply maintained as much of the tradition as Clavius saw feasible.
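
For the curious, here is the widely published "Anonymous Gregorian" computus (the Meeus/Jones/Butcher form), given only as a sketch of how much mean-lunation bookkeeping the Gregorian method still carries: a = year mod 19 is the position in the 19-year lunar cycle, and the remaining terms apply the Gregorian solar and lunar corrections and the weekday alignment.

    def gregorian_easter(year: int) -> tuple[int, int]:
        """Return (month, day) of Easter Sunday in the Gregorian calendar."""
        a = year % 19                      # position in the 19-year lunar cycle
        b, c = divmod(year, 100)
        d, e = divmod(b, 4)
        f = (b + 8) // 25
        g = (b - f + 1) // 3               # Gregorian lunar correction
        h = (19 * a + b - d - g + 15) % 30
        i, k = divmod(c, 4)
        l = (32 + 2 * e + 2 * i - h - k) % 7
        m = (a + 11 * h + 22 * l) // 451
        month, day = divmod(h + l - 7 * m + 114, 31)
        return month, day + 1

    assert gregorian_easter(2014) == (4, 20)   # 20 April 2014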

So they just call the next Sunday after that date "Easter Sunday"

The Ecumenical Councils determined that the theology of Easter demanded that it fall on a Sunday more so than insisting that it be on Nisan 16, maintaining the symbolism of Jesus remaining dead through Saturday ("resting on the Sabbath") and rendering Sunday an "eighth day." Sunday is "the Lord's Day" (literally, in most European languages) specifically because it is the day of the week of the Resurrection.

It's not so much about precise intervals as it is about validating triggers.

On the contrary: for both the Jewish and Christian calendars it is more about the intervals than the triggers.

The defining astronomical events for the respective calendars and holiday schedules are necessarily instants (e.g. lunar opposition in Libra), and any given instant will fall on a different calendar date depending on the observer's longitude. The result would be, for example, Christians in Asia and the Americas observing Easter one week apart from each other, fragmenting the community.

There are two ways around this: declare a favored meridian ("lunar conjunction in Aquarius, Beijing standard time"), or measure the intervening time with a standardized integer count of whole days. After all, lunar opposition doesn't actually occur a fixed amount of time after lunar conjunction, let alone exactly 1 123 200 s later. (Jews and Christians count mean lunations.)
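
A quick arithmetic aside on that 1 123 200 s figure, using the standard long-term mean for the synodic month:

    SECONDS_PER_DAY = 86_400
    MEAN_SYNODIC_MONTH = 29.530589        # days, approximate long-term mean

    # 1 123 200 s is exactly 13 whole days (the ecclesiastical count from the
    # first day of the lunar month to its 14th), while the mean interval from
    # astronomical conjunction to opposition is half a synodic month, and any
    # individual lunation wanders around that mean by many hours.
    assert 13 * SECONDS_PER_DAY == 1_123_200
    print(MEAN_SYNODIC_MONTH / 2)         # ≈ 14.77 days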

Comment Re:Thirteen months, who's on crack? (Score 1) 209

This was during the time of the kingdom of Rome.

We have essentially zero contemporary sources of information about pre-republican Rome and its calendar. The best we have are comments essentially made in passing by people writing in the late republic and early empire (e.g. Virgil), centuries after the fact, and these observations often contradict each other. Anybody speaking of chronology much before the middle of the republican period literally doesn't know what they're talking about.

Even with the republican calendar itself, probably the best source of information we have is Macrobius, writing centuries after Julius Caesar and the beginning of the empire.

To put things into perspective, even the Julian calendar is discontinuous before AD 12 or so, decades after the death of Julius Caesar.

(Romans didn't do math, they conquered other people to do the math for them.)

Comment Re:Obligatory griping (Score 1) 209

and a computer will probably do it for them automatically, and a lot of people won't even have to worry about it at all during their lifetime

Which is why I raised doubts that a computer would be programmed to do it correctly to begin with (it will only be on the minds of programmers near that 128-year threshold), and doubts that it would be remembered at all ("Just let the machine do it").

Each... quarter are of constant length

Of dubious value, as tropical seasons are not of equal (or uniform) length (they currently run from roughly 89 to nearly 94 days), and the beginning and ending dates of your quarters will be non-obvious.

Each time measurement unit begins at 0

Every single law and contract mentioning "the first of the month" will need to be rewritten to be unambiguous, since "1st day of the month" and "day 1" will be two different days.

Each quarter lasts for 3¼ months or 13 weeks

Only if you ignore your epagomenal days. Food and fuel must still be consumed, rent must still be paid, and interest must still be accrued. Your perpetual quarters actually average exactly 91.310 546 875 days.

This brings us to the general problem of epagomenal days: for trade and commerce, are they to be treated as part of the 1st week/month/quarter (not to be confused with "week/month/quarter 1") or the last? And if that question can and will be answered for all purposes, why insist on epagomenal days at all instead of explicitly appending them to the prior or following week/month/quarter? Leaving them epagomenal will only breed ambiguity, which will breed lawsuits.

Note that, in the current system, bissextile days are explicitly part of February and additive leap seconds are explicitly part of 23:59 UTC.
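
For reference, the arithmetic behind the 91.310 546 875 figure above, taking the calendar's own stated leap rule (an extra day every 4th year, skipped every 128th) at face value:

    from fractions import Fraction

    mean_year = 365 + Fraction(1, 4) - Fraction(1, 128)
    print(float(mean_year))       # 365.2421875 days per mean year
    print(float(mean_year / 4))   # 91.310546875 days per mean quarter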

Calendrical drift is suspended by including leap days in years that are a multiple of 4 but not of 128

There is no such thing as a perpetual calendar. Comparisons between intercalation algorithms can only remain valid for about one millennium from now. Even the current standard of leap seconds itself (which allows for up to 12 adjustments per year) will break down around then.

All written dates are unambiguous.

But the relationship between cardinal and ordinal dates will become ambiguous.

zero-basing: Most standards (like ISO 8601, UTC, TAI) today define 00:00:00 to be the equivalent of midnight and therefore occurring on the next day

Not everyone within industry is compliant with those standards (I again raise the example of POSIX). You're presuming that, along with your calendar itself, not only will all of industry finally and properly adopt and abide by these standards, but the general populace will as well.

Consider that the general populace can't even agree on the starting point of a new day for all purposes. As an example, meteorology generally treats dawn/sunrise as the beginning of a new date, as demonstrated by the timing of low temperatures predicted for a particular date.

I had a middle school substitute teacher back in the day that told me that midnight occurs on neither side, but simply in between the two, so, technically, it's in neither day.

Your substitute teacher was expressing a personal opinion, one of several. And we haven't even gotten into the definition of midnight itself.

Comment Re:Obligatory griping (Score 1) 209

Personally I like the simplicity of standardized units and I'm still happy that the 28-day month is still in between the sidereal (~27.3) and synodic (~29.5) periods of the moon.

True, but the synodic is far and away the most influential on human affairs, including factors such as natural nighttime illumination, tides, and our reproductive cycle. The synodic month is the one that can be determined even with overcast skies.

Omitting a leap day every 128 years was not chosen because it was a power of 2, but because it is literally THE MOST EFFICIENT METHOD that exists to keep a year synchronized with the same point,

I'll grant that it's probably more precise than the Gregorian algorithm for the next few centuries (though I'm going by the numbers for the northward equinox rather than the southern solstice specifically), but it's not more efficient in decimal math. Determining divisibility by 128 means effectively doing the division, unlike the quick trailing-digit tests that work for 4*10^n.

It's also the kind of obscure number liable to be forgotten in the intervening time. After all, it would be even more efficient to use a strictly quadrennial, Julian arrangement and assume that your particular piece of code won't still be in use by the time the difference matters, a la Y2K.
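
To make the divisibility point concrete (a sketch of mine, with ordinary integer year numbers standing in for however the calendar actually counts its years):

    # Divisibility by 4 can be read off the last two decimal digits, but
    # divisibility by 128 effectively means doing the division. In binary it
    # is a single mask -- which is rather the point: cheap for machines,
    # opaque for people doing mental arithmetic.
    def divisible_by_128(year: int) -> bool:
        return year & 127 == 0            # same as year % 128 == 0

    print([y for y in range(1900, 2200) if divisible_by_128(y)])  # [1920, 2048, 2176]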

Having a simple, accurate, efficient "computational" algorithm is what this calendar is all about.

Have you considered the accuracy versus efficiency of table lookups? Chinese New Year is a rigorously defined astronomical phenomenon, but most people just rely on a published calendar and don't ask questions. I'd wager that even most Jews have no idea how their calendar works.

Historically speaking, having a calendar mathematically simple enough for the average person to reproduce has been a European cultural artifact reflecting European values and experiences (e.g. a distrust or lack of a central authoritative source), and there need not be any objective advantage to such ease of reproducibility.

Besides, everybody is familiar with our current system's zero-based (24-hour clock) hours, minutes, and seconds anyway, so why not the rest of the date units, right?

That's actually a bad example, as the date of any given midnight is currently ambiguous. Does it belong to the prior day, the later day, to both, or to neither? Statutes and contracts in the US (at least) typically specify "23:59" or "0:01" specifically to avoid that problem.

Comment Re:Thirteen months, who's on crack? (Score 5, Informative) 209

Actually, back in the pagan days, there WERE 13 months

Embolismic months are not constant, but are inserted because there is a difference of about 11 days between 12 synodic months (~354 days) and one tropical year (~365 days). An embolismic month ends up being added approximately 7 years out of 19, by different algorithms according to different cultures. And even if you intended to include Jews (and their occasional "Adar II") among your categorization of "pagans," even Christians keep track of embolismic lunations in reckoning the date of that faith's holiest day (in the Gregorian Calendar, May 30 is the first day of the seventh lunation out of thirteen in AD 2014). The only major religion that absolutely, positively insists on a year of 12 months for all purposes is Islam.
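
The arithmetic behind that "about 7 years out of 19" figure, using the standard mean values:

    TROPICAL_YEAR = 365.2422      # days, approximate
    SYNODIC_MONTH = 29.530589     # days, approximate

    print(TROPICAL_YEAR - 12 * SYNODIC_MONTH)  # ≈ 10.9-day shortfall per year
    print(19 * TROPICAL_YEAR)                  # ≈ 6939.60 days
    print(235 * SYNODIC_MONTH)                 # ≈ 6939.69 days (Metonic cycle)
    print(235 - 19 * 12)                       # 7 embolismic months per 19 years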

The year started in spring, and December was the 10th out of 13 months.

It was the tenth of ten months; the early Romans likely reckoned winter as lying outside the calendar altogether. January and February (and Mercedonius/Intercalaris) were added later, probably when what passed for Roman astronomy became relatively more sophisticated. And it wasn't only "pagans" who insisted that March was the first month. The last major hold-out, the United Kingdom of Great Britain, didn't change until AD 1752 (AUC 2505). And not all "pagans" were or are Roman.

Comment Obligatory griping (Score 3, Interesting) 209

Synchronized with the northern winter solstice,

By their nature, solstices are notoriously difficult to determine empirically. Theoretically there is an instant when the sun's declination reaches its minimum, but practically you'll have hours or even days of a change in declination that is too small to measure. Popular surviving calendars either rely on an equinox instead (Christian, Jewish) or pad several lunations after the solstice just to make sure (Chinese).

the terran computational calendar began roughly* 10 days before

Whose ephemeris?

Each year is composed of 13 identical 28-day months

Two figures that generally have nothing to do with natural phenomena. While it's true that a little more than one-third of all tropical years contain 13 synodic months, those months average to around 29.5 days each. There are cultures that care about the synodic month exclusively, and there are those that care about both the synodic month and the hebdomadal week, but I know of no major religion or regionally dominant culture that cares about only the hebdomadal week.

followed by a 'minimonth' that houses leap days (one most years and two every 4th but not 128th year)

We limit calendars to arithmetical processes because accuracy must be balanced with ease of use for human beings, and we tend to prefer powers of ten because that makes the arithmetic easier for humans. If you're going to insist on powers of two in your calendar, you're effectively requiring people to reach for some sort of computer to perform the algorithm for them (except for those rare few who enjoy performing long division). And if you're already doing that, there's no longer a reason to limit your calendar algorithm to arithmetical (or even algebraic) processes at all; just have a computer chew on the transcendental functions directly rather than on an arithmetical approximation. Shoehorning in a power of 2 is a compromise that satisfies nobody.
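
For what it's worth, here is the year structure the quoted description implies, as I read it (13 fixed 28-day months plus a 1- or 2-day minimonth); how the calendar numbers its years from its own epoch is not something I'm assuming here, so plain integers stand in:

    def minimonth_days(year: int) -> int:
        """1 epagomenal day most years, 2 in every 4th year that is not a 128th."""
        return 2 if (year % 4 == 0 and year % 128 != 0) else 1

    def year_length(year: int) -> int:
        return 13 * 28 + minimonth_days(year)   # 365 or 366 days

    assert {year_length(y) for y in range(512)} == {365, 366}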

and leap seconds (issued by the IERS during that year). Each date is an unambiguous instant in time

Coordinated Universal Time and its system of coordinated leap seconds are older than POSIX, and yet even today POSIX still can't get leap seconds right, insisting that each and every day is exactly 86 400 s long (which is a big part of why we're having our current Leap Second Holy War to begin with). IT has been kicking that can down the road for about 40 years. Why would adopting your calendar suddenly change that?
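
To make the POSIX complaint concrete: "Seconds Since the Epoch" is defined by pure 86 400-second-day arithmetic, so an inserted 23:59:60 has no representation of its own. A sketch of that arithmetic (mine, not any particular libc's API):

    from datetime import date

    def posix_timestamp(y: int, mo: int, d: int, h: int, mi: int, s: int) -> int:
        days = (date(y, mo, d) - date(1970, 1, 1)).days
        return days * 86_400 + h * 3_600 + mi * 60 + s

    # The leap second at the end of 2016 collides with the first second of 2017:
    assert posix_timestamp(2016, 12, 31, 23, 59, 60) == posix_timestamp(2017, 1, 1, 0, 0, 0)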

that exploits zero-based numbering

Programming languages can't agree on where to start an array, but to my knowledge nobody is currently using a calendar with a "day 0" or "month 0" (let alone a "zeroth day" or "zeroth month"). Insisting on "zero-based numbering" doesn't solve anything, but rather dumps IT's own internal issues with counting onto the rest of the world.
