Author Topic: 2024 Hurricane Season -- Azores High, Plankton and Algae; seed the ocean?  (Read 2385 times)


Offline zpiro

  • Member
  • *
  • Posts: 2
Dear NOAA,


https://www.nhc.noaa.gov/gtwo.php?basin=atlc&fdays=2

This is a two-fold concern:
1) using planes to puncture systems and clouds to reduce the energy available;
2) fighting undesirable algae blooms.

The problem is that this delays the problems while aggravating global systems. Thermal boundaries may shift towards wanting to move moisture across the Ferrel cell.

https://earthobservatory.nasa.gov/global-maps/MY1DMM_CHLORA

The waters from the coast of Africa to the Americas do not have an excess of what is described here:
https://www.nature.com/articles/326655a0

Quote
The major source of cloud-condensation nuclei (CCN) over the oceans appears to be dimethylsulphide, which is produced by planktonic algae in sea water and oxidizes in the atmosphere to form a sulphate aerosol. Because the reflectance (albedo) of clouds (and thus the Earth's radiation budget) is sensitive to CCN density, biological regulation of the climate is possible through the effects of temperature and sunlight on phytoplankton population and dimethylsulphide production. To counteract the warming due to doubling of atmospheric CO2, an approximate doubling of CCN would be needed.
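
That "approximate doubling of CCN" figure can be roughly sanity-checked with the standard Twomey susceptibility relation, dA/dlnN ≈ A(1 - A)/3, where N is droplet number (taken here as a proxy for CCN) and liquid water is held fixed. A back-of-envelope sketch only, not a substitute for the paper's own radiative calculation:

Code:
import numpy as np

# Twomey susceptibility: at fixed liquid water, cloud albedo A responds to droplet
# number N (a proxy for CCN) roughly as dA/dlnN = A*(1 - A)/3.  A first-order estimate
# of the albedo increase from doubling N is therefore ln(2) * A*(1 - A)/3.
def albedo_increase_for_ccn_doubling(albedo):
    return np.log(2.0) * albedo * (1.0 - albedo) / 3.0

for a in (0.3, 0.5, 0.7):
    print(f"cloud albedo {a:.1f}: doubling CCN adds roughly {albedo_increase_for_ccn_doubling(a):.3f}")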

The amount of metals and minerals required for plankton and algae isn't extremely high. Plankton seem to have two ideal temperatures, around 18 and 25 °C; algae do best between 20 and 30 °C, with the optimum closer to 27 °C.
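
Taking those bands at face value (they are rough numbers, not a vetted source), the check of a sea-surface temperature against them is trivial:

Code:
# Classify a sea-surface temperature against the bands claimed above:
# plankton optima near 18 C and 25 C, algae 20-30 C with an optimum near 27 C.
def bloom_suitability(sst_c, tolerance_c=1.0):
    return {
        "near_plankton_optimum": min(abs(sst_c - 18.0), abs(sst_c - 25.0)) <= tolerance_c,
        "in_algae_band": 20.0 <= sst_c <= 30.0,
        "near_algae_optimum": abs(sst_c - 27.0) <= tolerance_c,
    }

for sst in (17.0, 24.5, 27.5, 31.0):
    print(sst, bloom_suitability(sst))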

NOAA has access to the military, which suggests something to do; the questions should be the costs and how it can help them. With the means and responsibilities involved, it is worth suggesting that the Atlantic, and especially its important regions, be made chlorophyll green.

A simulation may not be needed; the point is that marginal efforts have the potential to make major hurricanes worse, while possibly making stop-gap measures for seeding rain and clouds easier. It inherently makes sense, and can be supported with questions, such as whether the problem would be smaller if more whales were defecating while sleeping in the surface waters around the Azores High.
Not to mention off the coast of Africa, where one would expect many more smaller systems, and less chance of bigger systems reaching deeper below the surface waters to pick up resources for growth. The aim is stimulating desirable environmental activity and nature on the correct side of a wrong development towards an ice age, starting from the question of how far extreme cold fronts in the north would need to go for ice coverage to reach far enough south for algae blooms to reduce the effects of extreme weather, with its excessive energy absorption and evaporation around the equator.

Would you sign off on this, as generally the right thing to do in the Atlantic hurricane season of 2024 and beyond?
Basically, an operation 'whale shit' affecting cloud formation, where plankton especially may try hard to keep the surface temperature below 25 °C.
This is entirely based on the Earth as a thermodynamic engine, and on more classic weather models.
https://groups.seas.harvard.edu/climate/eli/research/equable/hadley.html
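
To put one number on the "thermodynamic engine" framing: the usual back-of-envelope treats a hurricane as a Carnot engine running between the warm sea surface and the cold outflow aloft. The temperatures below are round, assumed values for illustration, not measurements:

Code:
# Back-of-envelope Carnot efficiency of a hurricane treated as a heat engine between
# the sea surface (heat source) and the cold outflow near the tropopause (heat sink).
sst_k = 273.15 + 28.0        # assumed warm sea surface, about 28 C
outflow_k = 273.15 - 70.0    # assumed outflow temperature, about -70 C

efficiency = (sst_k - outflow_k) / sst_k
print(f"Carnot efficiency: {efficiency:.3f}")            # roughly one third

# The same arithmetic shows how one extra degree of surface warmth nudges the bound up.
efficiency_plus1 = (sst_k + 1.0 - outflow_k) / (sst_k + 1.0)
print(f"with +1 C at the surface: {efficiency_plus1:.3f}")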

I don't see a problem with this, but one may need to find the right scale and try variations.
Plankton are also able to move, and seemingly prefer 18 °C and 25 °C according to the publication, compared to a range for algae.
Satellite imagery can't be trusted here, for this and other reasons; it just means the activity is deeper and coming from elsewhere.
While somewhat flirting with "be the keystone sperm whale, take your yacht and file iron rust into its habitat".
I think it's foolproof and needed around exactly this, unlike marine life closer to, what is it, 60 metres or so, where it wouldn't have the mobility or energy to stimulate clouds to protect itself, so big waves would be needed to lift it up instead.
Because what does the sperm whale involve in the food chain? Squid at tremendous depth, then sleeping and defecating at the surface.

But it IS thermal; what else can you do, when guessing whether you will be snowed in or hit by hurricanes?
A bit closer to the surface: I believe sperm whales were named before microscopes, and I suspect explosive surface defecation, just to have some entirely wild guesswork included.
Marine life may be the most realistic basic ecology, in the Gaia-theory sense, to stimulate if hope is lost for the rain forests, both as a carbon sink and for regulation.
I have no problem with asking, and realise that professional forecasters or scientists won't propose it loudly; others can maybe agree on the question and ensure answers are demanded. So why not try hard and commit a Navy fleet to pump ocean water from some depth, add the nutrients needed, and simulate it well, with proper satellite monitoring, and not mere chlorophyll, in demand? Don't ask me what the white dots that pop up actually are; they could be several things. And maybe the 1880s sperm whale population, 36 times today's, with its behaviour, would do something today: infamous for sinking small sailboats, or thought to when woken and panicking, as bad as floating containers, but killed off by larger ships. And obviously involve all whale species, but the Azores is a key system; NOAA should in principle be able to ask for this to be done in a specific region of the Atlantic?
Enjoy: https://www.instagram.com/discovery/reel/Cx53-U5tl2Q/
« Last Edit: June 22, 2024, 03:27:38 PM by zpiro »

Offline zpiro

  • Member
  • *
  • Posts: 2
Did they? Are we playing with the Azores High and the weather coming off Cabo Verde?

If so, I have a second request. It's not a huge secret to what extent weather, and what one believes about the weather, has mattered for the world's military powers; think Normandy and the weather reports.

That being said, if one wants mathematicians to play with the math available in the world of physics, there is the compute available to run weather models across defined systems affecting large surrounding systems, affecting planetary systems, and all the way back to systems steered by global patterns, with high accuracy well into the future given the current state of measurements and available compute.

Declaring it public-domain work and a public good, banned from being classified, would be stellar, for the things that can be done globally: unstable models run in many places in the world are the rule, around math that threatens to error off into infinity in useful applied mathematics.

Beryl somewhat indicates that the various models are just mixed and matched, especially the global patterns that help steer systems, with internal development and local effects aiding the guesswork. Perhaps the height of expertise is experience with the models and their accuracy in different circumstances, combined with excessive amounts of compute to poke and prod at predictions; perhaps this is better done for tornadoes and surges?
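
As a toy illustration of the mix-and-match idea (all numbers below are made up, and real consensus aids and the official forecast are far more involved), blending a few model positions weighted by their recent track errors looks like this:

Code:
# Toy weighted model consensus: blend hypothetical 48 h position forecasts of a storm,
# weighting each model by the inverse of its (hypothetical) recent mean track error.
forecasts = {                       # made-up 48 h positions (lat, lon)
    "global_model_A": (14.2, -52.1),
    "global_model_B": (14.8, -51.5),
    "regional_model": (13.9, -52.6),
}
recent_error_km = {                 # made-up recent mean track errors
    "global_model_A": 120.0,
    "global_model_B": 90.0,
    "regional_model": 150.0,
}

weights = {m: 1.0 / recent_error_km[m] for m in forecasts}
total = sum(weights.values())
lat = sum(weights[m] * forecasts[m][0] for m in forecasts) / total
lon = sum(weights[m] * forecasts[m][1] for m in forecasts) / total
print(f"blended 48 h position: lat {lat:.2f}, lon {lon:.2f}")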

Something should maybe be done to make this more attractive to mathematicians interested in the relevant math, along with informaticians, and combinatorics that is efficient and practical on the hardware available; not entirely buried in the legacy stuff and software frameworks with layers of wrapper scripts, originally run on punch-card computers from IBM, before Cray specialized in making it run and give correct results with their own math libraries.

You need to be passionately interested, with the rare overlap of weather and mathematics, not to heed the advice to give these things a wide berth or stay well clear of them.
That is one reason why tender contracts come with compiled and built software, and verified results and performance. I'm told there is a Perl script that wraps around configuration options that used to be an IBM punch card.
https://forum.mmm.ucar.edu/threads/compile-wrf.14604/
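
For reference, the build that forum thread is about boils down to WRF's ./configure (the Perl wrapper around those options) followed by ./compile em_real. A minimal sketch driving it from Python; the paths and the configure answers ("34" for a GNU dmpar build and "1" for basic nesting) are assumptions, so adjust them to your own compilers:

Code:
# Minimal sketch of the standard WRF build sequence, driven from Python.
# Assumptions: the WRF source tree is unpacked at wrf_dir, NETCDF points at a working
# netCDF install, and configure option "34" (e.g. GNU dmpar) plus nesting option "1"
# are what you want.
import os
import subprocess

wrf_dir = "/path/to/WRF"                               # hypothetical source location
env = dict(os.environ, NETCDF="/path/to/netcdf")       # hypothetical netCDF prefix

# ./configure asks interactively for a compiler/parallelism option and a nesting option.
subprocess.run(["./configure"], cwd=wrf_dir, env=env,
               input="34\n1\n", text=True, check=True)

# ./compile em_real builds the real-data case; keep the log, it is where the errors hide.
with open(os.path.join(wrf_dir, "log.compile"), "w") as log:
    subprocess.run(["./compile", "em_real"], cwd=wrf_dir, env=env,
                   stdout=log, stderr=subprocess.STDOUT, check=True)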

Obviously, there are alternatives for university mathematicians not habilitated around national labs and institutes with their specializations and large support staffs, and there should be go-to software for things relevant to weather simulation, which is pretty much anything in applied physics and engineering. WRF is legendary for requiring a lot of work to set up and support.
https://en.wikipedia.org/wiki/Mesh_generation
And, well, cleaning up open-source software frameworks is easier if you are not under pressure and are motivated to study where abstract and analytical maths go to infinity in the calculations and throw off results. So most buy Cray to ensure that reduced or added floating-point precision doesn't wreck things, and are otherwise motivated and pressured to fix details and make it faster and crash less. And maybe the institutional politics of the usual university mathematics groups isn't attractive enough to get over the technical learning curve.
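
A toy example of the precision point (nothing WRF-specific, just the generic way a reduced-precision accumulation silently loses a result):

Code:
import numpy as np

# Add many small increments to a large number: in float32 every increment is below the
# resolution at 1e8 and vanishes; in float64 the same increments accumulate correctly.
big = 1.0e8
small = np.full(100_000, 1.0e-3, dtype=np.float32)

total32 = np.float32(big)
for x in small:
    total32 += x                        # each 1e-3 rounds away against 1e8 in float32

total64 = np.float64(big) + small.astype(np.float64).sum()

print(total32 - np.float32(big))        # 0.0    -> the 100 000 contributions are gone
print(total64 - 1.0e8)                  # ~100.0 -> the contribution float64 preserves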

And I'm not sure if the plankton or algae will be done yet, but if anyone meticulous, with attention to detail and an interest in the application of WRF, starts fixing it, I'm sure HPC sysadmins, mathematicians and meteorologists will love you forever: the code base and build system intimidate and ward off any thought of addressing legacy baggage and technical debt piled so high that a new hire able to tackle it would be made to do other things instead.

Having worked in HPC myself, your time gets eaten up by a Fields Medal mathematics institute professor who has bought three servers with dual-channel InfiniBand cards and asks for help; InfiniBand supports a token-ring-style network that you now need to set up, since it is a crossbar switch technology that requires a switch to centrally create and negotiate physical links down to the electrical level, or you buy a switch slightly more pricey than a fourth server. Those guys won't engage national labs or spend too much time on WRF. But for reason of being, on principle for the public good, and out of an inherent morality to be skeptical, it's hard to beat the HPC and mathematics overlap for a mathematician with interests in applied mathematics.

In either case, I assume my decade-old advice still applies: two machines with directly linked HBA cards with RDMA functionality let you play with all things HPC at low latency (at the 28-100 nanosecond level; 40 Gbit/s RDMA had CPU-core to CPU-core ping-pong latency through a switch at that level a decade ago, for a total of about 0.9 microseconds: a program running on two machines, asking 0 and getting the answer false). And a microsecond is an infinity for an FPGA programmer, so why would you bother with stepping stones to national labs with support? Public service gets shafted so quickly in available technical support. In psychology there are things like the spotlight effect, the Barnum effect, cognitive dissonance and the like; it just gets worse the more connected and able you are. And WRF is at the top of the list on technical debt; plowing through and labouring at it, all the complex problems and issues are likely to just get solved.
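
If anyone wants to reproduce that kind of number on their own pair of machines, the classic measurement is an MPI ping-pong. A minimal sketch with mpi4py, assuming an MPI stack built against the interconnect, run with "mpirun -np 2" across the two hosts:

Code:
# Classic MPI ping-pong latency microbenchmark: rank 0 and rank 1 bounce a 1-byte
# message back and forth and report the average round-trip time.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
buf = np.zeros(1, dtype=np.uint8)
iterations = 10000

comm.Barrier()
start = MPI.Wtime()
for _ in range(iterations):
    if rank == 0:
        comm.Send(buf, dest=1, tag=0)
        comm.Recv(buf, source=1, tag=1)
    elif rank == 1:
        comm.Recv(buf, source=0, tag=0)
        comm.Send(buf, dest=0, tag=1)
elapsed = MPI.Wtime() - start

if rank == 0:
    print(f"average round trip: {elapsed / iterations * 1e6:.2f} microseconds")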

I suspect it may very well be what he wanted to run:
"WRF has grown to have a large worldwide community of users (over 30,000 registered users in over 150 countries), and workshops and tutorials are held each year at NCAR. WRF is used extensively for research and real-time forecasting throughout the world."

IBM punch-card legacy crud, you had better believe it, and tremendously important! So much so that nobody will... put institutional authority and leverage behind efforts that may fail, as it "works for those involved that matter".

And while this is a long-winded and edited follow-up post, what do people actually expect? Some mathematical questions aren't far from borderline spiritual; number theory is occasionally referred to as pure metaphysics. As far as logic problems go, there is the question of what the minimal physical action (Planck) is for various math problems and challenges, and its possible application, with an inherent preference for public work; mathematics has historically been placed as a moral science rather than a practical or knowledge-based one per se. There is a running joke in HPC about "chaining" someone down to use large parts of a big system on a problem, and mathematics relevant to some problems in physics and simulation quickly involves the carbon footprint of a small village in compute. And after all of this, if you think every amateur mathematician who runs WRF doesn't matter, then no matter the size and credibility of a national lab, with so many countries used to the status quo, why bother? The world knows open source, and a public good, when it sees one. And yes, I argued for declaring the activity public domain in domestic law, but how absurd is that?

Personally, I'm not sure if the ability to understand and discuss matters with a broad range of scientists made it rewarding, or what made me good at it; I am generally better at the absolute fundamentals, especially electrical mechanics and abstract logic. But professionally, chaining someone down to an HPC system is something I find funny, whether it is spending time on a Ph.D. student's visualisation request for their thesis, using a materials scientist to beta-test a new system as a scientific code developer, or institutionally offering someone a motivating level of support and compute to continue their work.

There are a whole lot of frameworks available to build HPC software and scientific codes; one of them is EasyBuild, which is far removed from the distribution support and availability you would expect given the public interest in the science. As a rule of thumb, a less complex code takes about two weeks to set up, verify and check for performance; WRF is a special case. And about that: it's the flagship code of this project out of Belgium, https://docs.easybuild.io/typical-workflow-example/. Quite verbatim at this stage, but nothing is exaggerated.

And I am fairly certain the mathematician I built WRF with EasyBuild for wasn't paranoid, but I'm not sure if the guy needing to run token ring on his private research servers would be. It's a funny thing at the end of the day: those who are able, with higher principles and agendas involved, may trust the system more than anyone or anything around themselves. And pretty much, the best mathematicians with some scepticism can go: well, I know how important algorithms are, and the ability to write code; I can just buy a machine and no world power would dare mess with it, but socially, can I be sure about the HPC guys? Let's call it the mother of all litmus tests a mathematician can do, involving weather forecasting.
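
And for the record, the typical EasyBuild workflow from that page amounts to searching for an easyconfig and then building it with dependency resolution. A minimal sketch driven from Python; the easyconfig filename is a placeholder I am assuming, so substitute whatever "eb --search WRF" actually lists on your setup:

Code:
# Minimal sketch of the typical EasyBuild workflow: search, dry run, then build with
# automatic dependency resolution (--robot). The easyconfig name below is a placeholder;
# use one that 'eb --search WRF' lists for your toolchains.
import subprocess

easyconfig = "WRF-4.x-foss-dmpar.eb"   # placeholder filename, not a specific known file

subprocess.run(["eb", "--search", "WRF"], check=True)                    # find candidates
subprocess.run(["eb", easyconfig, "--robot", "--dry-run"], check=True)   # show the build plan
subprocess.run(["eb", easyconfig, "--robot"], check=True)                # build WRF and missing deps
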
« Last Edit: July 13, 2024, 08:41:56 AM by zpiro »