Previously, any data in delta_t/predicted.txt that occurred after
the final integer year value was being skipped. Now the very last line
of data is included even when it is not on a year boundary.
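The fix can be sketched like this (a hypothetical parser in Python for illustration; the real code and the exact file format may differ):

```python
# Hypothetical sketch: keep the final delta T sample even when its
# fractional year falls after the last whole-year entry.
def parse_delta_t(lines):
    """Parse (year, delta_t) pairs from an iterable of text lines."""
    samples = []
    last = None
    for line in lines:
        year_text, dt_text = line.split()
        year, dt = float(year_text), float(dt_text)
        if year == int(year):
            samples.append((year, dt))   # keep whole-year samples
        last = (year, dt)                # remember the most recent sample
    # Include the very last line even if it is not on a year boundary.
    if last is not None and (not samples or samples[-1] != last):
        samples.append(last)
    return samples

data = ["2018.00 69.1", "2019.00 69.9", "2019.75 70.2"]
result = parse_delta_t(data)
```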
Figured out that JPL Horizons and other sources define elongation
as the full angle between bodies, not just the ecliptic projection
of the angle. This brings my predictions within 0.6 hours of JPL.
Still not happy with some of the hour errors, which reach up to 14 hours.
Maybe I need to search for the actual angle between the planet and the Sun,
not just the difference along ecliptic longitude.
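The distinction between the two definitions can be illustrated with a small sketch (hypothetical helpers, Python for illustration): the full elongation is the 3D angle between the geocentric Sun and body vectors, while the ecliptic projection ignores latitude entirely.

```python
import math

# Sketch, not the library's actual code: a body 30 degrees out of the
# ecliptic plane has zero longitude separation from the Sun, but a
# 30 degree full-angle elongation.
def full_elongation(sun, body):
    """Angle in degrees between two geocentric vectors (x, y, z)."""
    dot = sum(a * b for a, b in zip(sun, body))
    ms = math.sqrt(sum(a * a for a in sun))
    mb = math.sqrt(sum(a * a for a in body))
    return math.degrees(math.acos(dot / (ms * mb)))

def longitude_elongation(sun, body):
    """Difference in ecliptic longitude only, ignoring latitude."""
    lon_sun = math.degrees(math.atan2(sun[1], sun[0]))
    lon_body = math.degrees(math.atan2(body[1], body[0]))
    return abs((lon_body - lon_sun + 180.0) % 360.0 - 180.0)

sun = (1.0, 0.0, 0.0)
body = (math.cos(math.radians(30)), 0.0, math.sin(math.radians(30)))
```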
Also should try generating JPL Horizons data to verify the
test data I have here.
I don't have authoritative test data for Saturn, so I'm
just comparing against its own calculations, after having
visually inspected some test cases against my old code and
Heavens Above. This is not really adequate but it's the best
I can do right now.
Now Uranus calculations match output of JPL Horizons closely.
I figured this out by graphing JPL Horizons data and tweaking
my phase curve formula to match.
I was able to eyeball the slope from a graph of the phase curve
deduced by analyzing JPL Horizons output.
Now my Pluto magnitude values agree closely with JPL Horizons.
I found a paper by James L. Hilton (USNO) that provides
formulas for the phase curves of Mercury and Venus that match
the JPL Horizons tool to within 0.012 mag.
https://iopscience.iop.org/article/10.1086/430212
Not fully validated, but I did tweak the Montenbruck/Pfleger
formulas to match JPL Horizons output for 2018-04-27 02:00 UTC.
Still need to implement formulas for Moon and Saturn.
"Phase angle" means the angle between the Sun and the Earth
as seen by a third body.
This function calculates the angle between the Sun and a body
as seen from the Earth, with a range that extends all the way to
360 degrees, making it possible to find all four quarters of the Moon's cycle.
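The 0-to-360 degree convention might be sketched like this (hypothetical name; an ecliptic longitude offset stands in for the full calculation):

```python
# Hypothetical illustration: extend the Sun-to-Moon longitude difference
# to the full range [0, 360), so each lunar quarter corresponds to a
# distinct target angle instead of two quarters sharing the same value.
def longitude_offset(sun_lon, moon_lon):
    """Moon's ecliptic longitude east of the Sun, in [0, 360) degrees."""
    return (moon_lon - sun_lon) % 360.0

# Targets: 0 = new moon, 90 = first quarter, 180 = full, 270 = third quarter.
```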
When searching for oppositions and conjunctions of Mercury and Mars,
dynamically adjust to their eccentric orbits by tweaking the
effective synodic period based on how far we missed the mark
on each iteration.
This brings the average iteration count for Mercury from 19 down to 6.4.
All the other planets got at least a little better.
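The adjustment described above might look roughly like this toy sketch (hypothetical names and a fake linear-plus-wobble longitude function; the real SearchRelativeLongitude surely differs in detail):

```python
import math

# Toy sketch of the adaptive-period idea: step toward the zero of the
# relative longitude using an *effective* synodic period, and refine
# that period from the locally observed drift rate whenever we miss.
def search_event(rel_lon, t0, mean_synodic_days, tol=1e-6, max_iter=100):
    """Find t near t0 where rel_lon(t) wraps to 0 degrees."""
    t = t0
    syn_eff = mean_synodic_days          # days per 360 degrees of drift
    prev_t = prev_err = None
    for _ in range(max_iter):
        err = ((rel_lon(t) + 180.0) % 360.0) - 180.0   # wrap to (-180, 180]
        if abs(err) < tol:
            return t
        if prev_t is not None and err != prev_err:
            # How far we missed the mark tells us the local drift rate,
            # so tweak the effective synodic period (secant-style update).
            rate = (err - prev_err) / (t - prev_t)     # degrees per day
            syn_eff = abs(360.0 / rate)
        prev_t, prev_err = t, err
        t += syn_eff * err / 360.0       # assumes rel_lon decreasing in t
    return t

# Stand-in for a real ephemeris: longitude drifts ~0.5 deg/day with a
# small nonlinear wobble (crudely mimicking orbital eccentricity).
demo = lambda t: 100.0 - 0.5 * t - 2.0 * math.sin(t / 30.0)
t_event = search_event(demo, 0.0, 720.0)
```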
Confirmed that Mercury is taking 19 iterations/call on average,
and Mars is taking about 10 iterations/call.
The other planets average 6 iterations/call.
Still need to verify that there is a consistent interval between
consecutive events.
Mercury is taking way too long to converge.
Mars is kind of slow too.
Need to improve the efficiency of SearchRelativeLongitude!
Using relative heliocentric ecliptic longitude of the Earth
and the other planet. Home in on when both planets have the
same longitude (the difference is 0).
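The quantity being searched can be sketched as follows (hypothetical helper name):

```python
# The planet's heliocentric ecliptic longitude minus the Earth's,
# wrapped to (-180, +180] degrees; the search homes in on where
# this difference reaches zero.
def relative_longitude(planet_lon_deg, earth_lon_deg):
    diff = (planet_lon_deg - earth_lon_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff
```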
For some reason, all my calculations are about 8 minutes earlier
than predictions from the test data. I suspect this is because
of light travel time from the Sun (equivalently, aberration).
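A quick arithmetic check supports that suspicion: light takes about 8.3 minutes to travel 1 au, which matches the observed offset.

```python
# Light travel time for 1 au, using standard values for the au and c.
AU_METERS = 1.495978707e11       # astronomical unit in meters
C_M_PER_S = 2.99792458e8         # speed of light in m/s
light_time_minutes = AU_METERS / C_M_PER_S / 60.0   # roughly 8.3 minutes
```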
Added new function Astronomy.SunPosition().
It is supposed to return ecliptic coordinates of date for the Sun
as seen from the center of the Earth.
The values look reasonable but I need to test them.
Will use the Sun's longitude in the return value from SunPosition()
to determine solstices and equinoxes.
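The idea can be sketched with a crude linear stand-in for the Sun's longitude (not a real ephemeris, and a plain bisection rather than whatever search the library uses): find the moment the longitude crosses 0, 90, 180, or 270 degrees.

```python
# Crude linear stand-in for the Sun's geocentric ecliptic longitude,
# measured in degrees from an equinox at t = 0 (t in days).
def sun_longitude(t):
    return (0.9856 * t) % 360.0        # ~360 degrees per 365.25 days

def search_sun_longitude(target, t1, t2, tol=1e-6):
    """Bisect for t in [t1, t2] where the longitude crosses `target`."""
    def offset(t):
        # Signed angular distance from the target, wrapped to (-180, 180].
        return ((sun_longitude(t) - target + 180.0) % 360.0) - 180.0
    lo, hi = t1, t2
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if offset(lo) * offset(mid) <= 0.0:
            hi = mid                   # crossing is in the left half
        else:
            lo = mid                   # crossing is in the right half
    return 0.5 * (lo + hi)

# June solstice in this toy model: longitude reaches 90 degrees.
t_cross = search_sun_longitude(90.0, 80.0, 100.0)
```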
All callers of sidereal_time ended up needing it for apparent sidereal
time, not mean sidereal time. So I simplified the code by removing the
unused support for calculating GMST.
Astronomy.GeoVector no longer iterates to try to correct light
travel time for the apparent position of the Sun. The Sun's
heliocentric coordinates are always (0,0,0), so there is no need
to do that.
Search limit can be adjusted in options passed into Search().
After 20 iterations, we should have divided the search
region by a factor of more than a million (halving the interval
each time gives 2^20, about 1.05 million). If quadratic
interpolation can't finish the job at that point, something
is really wrong.
Allow caller to pass in pre-evaluated endpoints to begin the search.
This eliminates 2 function calls per search, reducing the
average from 8 calls/search down to 6 calls/search.
I think this is about as good of performance as I'm going to get.
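The API idea can be sketched with a plain bisection stand-in (hypothetical; the real Search also uses quadratic interpolation): the caller may pass pre-evaluated endpoint values f1 = f(t1) and f2 = f(t2), saving two function evaluations per search.

```python
# Sketch of a search that accepts optional pre-evaluated endpoints.
def search(f, t1, t2, f1=None, f2=None, tol=1e-9):
    """Find a root of f in [t1, t2]; None if no sign change is bracketed."""
    if f1 is None:
        f1 = f(t1)                     # only evaluate when not supplied
    if f2 is None:
        f2 = f(t2)
    if f1 * f2 > 0.0:
        return None                    # no sign change in the bracket
    while t2 - t1 > tol:
        tm = 0.5 * (t1 + t2)
        fm = f(tm)
        if f1 * fm <= 0.0:
            t2, f2 = tm, fm            # root is in the left half
        else:
            t1, f1 = tm, fm            # root is in the right half
    return 0.5 * (t1 + t2)

root = search(lambda t: t * t - 2.0, 0.0, 2.0)
```

When scanning consecutive time windows, the right endpoint's value from one window can be reused as the left endpoint's value for the next, which is where the two saved calls come from.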
The smaller the slope magnitude |df/dt| is, the larger
the uncertainty in dt. That means we are better off estimating
the slope afresh on each iteration than underestimating the time
error, as we were doing with a fixed maximum slope.
This also simplifies the code, and does not slow it down much.
Now pass in max slope of function to be searched, expressed
in units/day. By seeing how far the function is from zero,
we can deduce whether we are within the specified time tolerance
of finding the event.
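The stopping test might be sketched like this (hypothetical names): with max_slope an upper bound on |df/dt| in units/day, |f(t)| / max_slope serves as an estimate of how far t is from the event. (As the note above about slope estimation explains, this underestimates the error whenever the local slope is below max_slope.)

```python
# Sketch of the slope-based convergence test.
def time_error_estimate(f_value, max_slope):
    """Estimated days between t and the event, given f(t) = f_value."""
    return abs(f_value) / max_slope

def converged(f_value, max_slope, dt_tolerance_days):
    """True when the estimated time error is within the tolerance."""
    return time_error_estimate(f_value, max_slope) <= dt_tolerance_days
```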
Use a simplified refraction model in the rise/set search so that
the function is better fit by parabolas. Assume constant refraction
instead of variable refraction, because refraction only matters near
the horizon anyway. Use a canned value of +34 arcminutes, which creates
a close fit with the test data.
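The constant-refraction assumption amounts to this (hypothetical names): instead of searching for true altitude 0, treat a body as rising or setting when its airless altitude reaches -34 arcminutes, since refraction lifts objects near the horizon by roughly that amount.

```python
# Canned +34 arcminute refraction value, converted to degrees.
REFRACTION_AT_HORIZON_DEG = 34.0 / 60.0

def rise_set_offset(true_altitude_deg):
    """Zero exactly at the modeled rise/set moment; smooth in time."""
    return true_altitude_deg + REFRACTION_AT_HORIZON_DEG
```

Because this offset varies smoothly with time near the horizon, it is much better fit by parabolas than a model whose refraction changes with altitude.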