You can change the starting sea-level pressure to anything you want, but the claim was that if you enter 29.3 meters, the difference will always be 3.51 hPa for that elevation.
This does not appear to be the case, at least not on the calculation web page being linked to.
Scenario 1: If I enter my elevation (282 meters) and leave the temperature at 15 degrees Celsius, converting an absolute pressure of 980 hPa yields a sea-level pressure of 1013.22 hPa.
Scenario 2: With the same elevation (282 meters) and temperature (15 degrees Celsius), converting an absolute pressure of 1020 hPa yields a sea-level pressure of 1054.58 hPa.
In Scenario 1, the gap is 1013.22 - 980 = 33.22 hPa.
In Scenario 2, the gap is 1054.58 - 1020 = 34.58 hPa.
So between those two scenarios alone, the gap varies by 1.36 hPa. If I go more extreme, from 950 hPa to 1050 hPa, the gaps differ by 3.39 hPa, about 10% of the gap itself.
My main question, then, is how can a weather station that uses a constant gap ever accurately depict relative pressure? The console should be asking for elevation and calculating relative pressure itself, since it would then have all the information it needs to fill in the equation accurately.
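To illustrate the point, here is a small Python sketch of a standard sea-level reduction formula (the ICAO-style formula with a 0.0065 K/m lapse rate). I don't know the exact internals of the linked web calculator, but this formula reproduces its numbers for my elevation, and it makes clear that the reduction is multiplicative, so the gap scales with the absolute pressure rather than staying constant:

```python
import math

def sea_level_pressure(p_abs_hpa: float, elevation_m: float, temp_c: float) -> float:
    """Reduce station (absolute) pressure to sea-level (relative) pressure.

    Standard barometric reduction with the ICAO lapse rate (0.0065 K/m).
    This matches the web calculator's output for 282 m / 15 C, but the
    calculator's exact internals are an assumption on my part.
    """
    lapse_k = 0.0065 * elevation_m           # temperature change over the column, K
    t_col_k = temp_c + lapse_k + 273.15      # extrapolated sea-level temperature, K
    return p_abs_hpa * (1.0 - lapse_k / t_col_k) ** -5.257

# The multiplier depends only on elevation and temperature, so the gap
# (relative minus absolute) grows with the absolute pressure itself:
for p in (950.0, 980.0, 1020.0, 1050.0):
    slp = sea_level_pressure(p, 282.0, 15.0)
    print(f"{p:.0f} hPa -> {slp:.2f} hPa (gap {slp - p:.2f} hPa)")
```

At 282 m and 15 C the multiplier is about 1.0339, so every extra hectopascal of absolute pressure adds about 0.034 hPa to the gap, which is exactly the drift I measured above.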
At this point I am probably just going to work backward to what my absolute pressure should be by trusting the errors Gladstone QC is summarizing for me. I think I have dialed in my gap between absolute and relative as well as I can for a constant (that shouldn't be a constant); I based it on an absolute pressure of 1000 hPa.
Thanks for all of your informative posts!
Thanks,
sutekh137