Saturday, August 29, 2020

Theory of Resonant Aether and Negative Space

First question: how do you move the aether?

Second question: what is resonance of the aether?

Third question: is a resonance possible between positive and negative space?


The question is whether, in high dielectric fields at high harmonics, a perturbation arises in the aetherions such that the field resonates in and out of negative space. This zero-point energy could then replace all cell phone satellites as a kind of super radio without expanding waves. Or it could be a kind of Tesla energy which can be retrieved with a correctly tuned receiver.

This is a bit different from thermosphere excitation in resonant boosts. This is setting up a resonance in aether space itself. The force would be like a pendulum with nothing to cause it to lose energy at each resonance end. There is probably a limit to the size of the induced effect; first efforts might simply create one the size of a pea. And what would be the effect of standing in a region of space induced into this effect? Most likely deadly.
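To make the lossless-pendulum picture concrete, here's a rough numerical sketch (the parameters and the code are just an illustration on my part, not part of the theory above): an undamped pendulum stepped with a symplectic integrator, damping set to zero, so the swing keeps essentially all of its energy.

```python
# Minimal sketch of the lossless-pendulum analogy: an undamped pendulum
# integrated with semi-implicit (symplectic) Euler. Damping is set to zero,
# so total energy stays essentially constant over many swings.
# All parameter values are arbitrary illustration values.
import math

g, L = 9.81, 1.0          # gravity (m/s^2) and pendulum length (m)
damping = 0.0             # "nothing to cause it to lose energy"
theta, omega = 0.5, 0.0   # initial angle (rad) and angular velocity (rad/s)
dt, steps = 0.001, 100_000

def energy(theta, omega):
    # kinetic + potential energy per unit mass
    return 0.5 * (L * omega) ** 2 + g * L * (1.0 - math.cos(theta))

e0 = energy(theta, omega)
for _ in range(steps):
    omega += (-(g / L) * math.sin(theta) - damping * omega) * dt
    theta += omega * dt   # semi-implicit Euler: use the updated omega

print(f"relative energy drift after {steps} steps: "
      f"{abs(energy(theta, omega) - e0) / e0:.2e}")
```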

So, if atoms cannot move the aether, nor can radiation; and since magnetism is itself caused by the perturbation, it certainly cannot cause movement either. It's a complex theory.

I find the theory of negative space pressures interesting. It's as if the perturbation moves inside and outside the space we recognize: the square-root-of-minus-one algebras and vector algebras. I think it's not N-dimensional but rather a space that is outside the aether, or where aether is not present, thus providing a space for more energy. Perhaps at high energy, perturbation flows in toroidal geometry induce a low pressure, or negative space. Consider the two-power-lines theory of energy containment: each line produces a negative-pressure void, the voids co-combine equidistantly between the lines, and together they achieve a negative aether space that contains what would otherwise produce a spark gap, a positive aether space through which perturbation flow can move.
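One conventional way to read the square-root-of-minus-one remark is the ordinary phasor picture, where an oscillation is written as a complex exponential and only the real part is the directly observed quantity. That reading is just an illustration on my part, not the negative-space theory itself:

```python
# Standard complex-phasor sketch: an oscillation written as A * exp(i*omega*t).
# The real part is the directly observed quantity; the imaginary part is the
# quadrature (90-degrees-out-of-phase) component. This is ordinary AC math,
# offered only as a reading of the sqrt(-1) remark, not as the theory above.
import cmath

A, omega_rad = 1.0, 2 * cmath.pi * 60.0   # amplitude and 60 Hz angular frequency
for t in (0.0, 1 / 240, 1 / 120):         # a few sample instants
    z = A * cmath.exp(1j * omega_rad * t)
    print(f"t={t:.6f}s  observed={z.real:+.3f}  quadrature={z.imag:+.3f}")
```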

Dollard's Great Talk on the History of Electricity

Watch all these and skip college:

Friday, August 28, 2020

What is Gravity?

Gravity is climbing through aether torsions of toroidal ejections of aetherions turning themselves inside out over time: a null-point-source perturbation in the aether inducing an effect we call acceleration. Deal with it. A magnet is simply a mostly unified field direction inducing atomic toroidal compressions/rarefactions in materials whose atomic composition can be aligned in their toroidal ejection streams. Like a jellyfish eating itself, the motion is an odd way to travel.

What is College?

 College is where dumb and smart people go. 

If you are a dumb person who goes to college, you go there to learn from people who are smarter than you.

If you are a smart person who goes to college, you go there to learn from people who are dumber than you.


Tuesday, August 25, 2020

Using Grammar to Develop Next Generation Language Systems

There's so much excitement about Neural Network implementations for understanding and finding answers in short paragraphs of text. Unfortunately, this approach never scales very well.

New attempts to add 10^19 parameters to encode it all still won't work. The reason is that the system still isn't really understanding what is being said, nor what the content actually means.

That's why we place our efforts into advanced grammar and sentence morphological analysis. This is different from the Google Universal Sentence Encoder, which embeds sentences in an N-dimensional hyperspace. When I tested those kinds of solutions, they didn't do very well at all.
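For context, here's a rough sketch of the kind of embedding-similarity test I mean, using the public Universal Sentence Encoder module from TF Hub; the example sentences are made up for illustration, not from any real evaluation.

```python
# Rough sketch of an embedding-similarity test against the public
# Universal Sentence Encoder (TF Hub module v4). Example sentences are
# illustrative only, not a real evaluation.
# requires: pip install tensorflow tensorflow_hub numpy
import numpy as np
import tensorflow_hub as hub

embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

query = "Why did the shipment arrive late?"
candidates = [
    "The delivery was delayed by a storm over the Atlantic.",
    "The invoice total was higher than last month.",
]
vectors = embed([query] + candidates).numpy()

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# rank candidate passages by cosine similarity to the query
for text, vec in zip(candidates, vectors[1:]):
    print(f"{cosine(vectors[0], vec):.3f}  {text}")
```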

We had a run-in with someone with a statistics background who claimed that NLP didn't work as well. Yeah, well, what did they do? You always have to ask exactly what was tried. Simple part of speech? Yeah, that doesn't work.
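Here's a small illustration of why bare part-of-speech tags fall short (a made-up example, assuming spaCy's off-the-shelf English tagger): two sentences with opposite meanings come out with the identical tag sequence.

```python
# Simple part-of-speech tagging with spaCy's off-the-shelf English model.
# Two sentences with opposite meanings yield the same POS sequence,
# which is why bare tags are not enough for understanding.
# requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
for sentence in ("The dog bit the man.", "The man bit the dog."):
    doc = nlp(sentence)
    print([token.pos_ for token in doc])   # same tag sequence, opposite meaning
```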

Graph encoding morphological grammar is our state of the art technique, but we don't stop there. We then apply a grammar transformation and world-map modeling, as well as a few other things, to map a conversation. I'm sure behind the scenes they are starting some of that at Google and Amazon, but not quite all of it. They will always be handicapped by their core neural network approach. And I'm someone who's been a pioneer and believer in neural networks - for what they do well - for well over 30 years. I believe 1986 was my first one.
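To be clear, this is not our pipeline, but as a generic sketch of what "putting grammar into a graph" can look like, here's a dependency parse loaded into a directed graph using off-the-shelf spaCy and networkx; the graph-encoded morphological grammar described above goes well beyond this.

```python
# Generic illustration of encoding a sentence's grammatical structure as a
# graph: spaCy's dependency parse loaded into a networkx DiGraph. This is a
# stand-in sketch only, not the graph-encoded morphological grammar
# described in the post.
# requires: pip install spacy networkx && python -m spacy download en_core_web_sm
import networkx as nx
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The technician restarted the server after the backup finished.")

graph = nx.DiGraph()
for token in doc:
    graph.add_node(token.i, text=token.text, pos=token.pos_, lemma=token.lemma_)
    if token.head.i != token.i:                 # skip the root's self-reference
        graph.add_edge(token.head.i, token.i, dep=token.dep_)

# print the head -> dependent edges with their grammatical relations
for head, child, data in graph.edges(data=True):
    print(f"{doc[head].text} -[{data['dep']}]-> {doc[child].text}")
```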

The reason why most people don't use grammar is that they don't know the theory and they aren't long-time researchers in the NLP field. It's a lot easier to get excited by the new stuff and believe that's the cool stuff. Well, in some ways they are right, but not in accuracy, precision, or ability. Not yet. Maybe in ten years. They have to go through similar efforts in research.

It can be frustrating running a language tech startup when others have a hard time understanding why what you are doing is better, until they see the results. Then it's just bang obvious. Fast. Precise. Great stuff. 

And that continues to get even better as we move into voice and conversational systems.  

Like Tesla, we build our company layer by layer, technology advance on top of technology advance.

Eventually we will arrive at very natural, human-seeming systems interacting with massive amounts of information and able to encode facts and world information in a fully autonomous form. Maybe that's 2030. But what other companies are striving for that? Not Amazon or Google; their approaches are intellectual dead ends, a party trick that does the basics well but won't scale.