Gatlin - IoT Farming Platform

The Gatlin IoT Farming Platform aims to use cryptonetwork utility to run algorithmic transactions on top of automated, efficient networks that grow food, adding something tangible back to an intangible currency: pooling resources to grow food and promote healthier, more sustainable land management.

Stack (30/May/2021)

  • Kotlin (main and embedded externals)
  • Gradle w/ Groovy (main)
  • Micronaut w/ http4k (maybe gRPC) (main)
  • pgSQL for the whole data store (main and embedded externals)

Main Cliffs About The Project

  1. The plain truth is that cryptocurrencies as they stand are a disaster. Insane fluctuations in monetary value make crypto a bad place to invest your money. That is where the agricultural aspect comes in. In essence, a crypto network generally gains value by being able to pool resources to mine for the network, i.e. to run algorithms that do a, b and c. So what are a, b and c, and how can I use this desire to collect and mine resources to my advantage? For easier conceptualization, we can look at this as another avenue to buy and sell futures. Food is a renewable resource with firm, immutable timelines (this won’t be a Monsanto op). I also believe that building out systems to better utilize physical resources is something a crypto network could do. Instead of hacking me and mining on my GPUs, the network can mine the field and help grow food. More on that later. Since we are still dealing with futures, we can expect a smoother transition and can offer not only quick transactions but also long-term financial plans.

  2. I believe the functional style, the singleton mindset, the coroutines/immutability constraints and the general design of Kotlin make it perfect not only for building out machine learning services to advance food science, but also for a “bastion” server in the fog model I have planned, handling resource management of peripheral devices. I don’t see Kotlin right now as the place to take the embedded stuff (that would be cool), but it excels everywhere else I need it, and it’s just fun to use.

  3. The project will be using Kotlin for the embedded part of the embedded land management system. This is a new development as of 30/May/2021: I just found out that Kotlin/Native can run on embedded systems without needing the JVM, and that it has interoperability with C. This is great news, as it eliminates another upkeep issue with multiple technologies.
    The embedded systems will run various tasks determined by the Bastion Scheduler, “EtherAgent”, which designates each device a task depending on transaction requirements. With the C capabilities, I feel better equipped to handle the low-level work needed to keep a peripheral lean and to minimize the resource expenditure and energy-saving issues that must be solved to make long-term work on a single battery charge a viable option for peripheral devices.
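    To make the scheduling idea concrete, here is a minimal Kotlin sketch; the `EtherAgent` API, the `Task` fields and the priority scheme are hypothetical placeholders for illustration, not the actual design:

```kotlin
// Hypothetical sketch of the Bastion Scheduler idea: the EtherAgent
// holds a priority queue of tasks and designates a batch per cycle.
data class Task(val id: Int, val kind: String, val priority: Int)

class EtherAgent(private val capacity: Int) {
    private val queue =
        java.util.PriorityQueue<Task>(compareByDescending<Task> { it.priority })

    fun submit(task: Task) = queue.add(task)

    // Hand out up to `capacity` tasks per cycle, highest priority first.
    fun designate(): List<Task> {
        val batch = mutableListOf<Task>()
        while (batch.size < capacity && queue.isNotEmpty()) batch.add(queue.poll())
        return batch
    }
}

fun main() {
    val agent = EtherAgent(capacity = 2)
    agent.submit(Task(1, "soil-moisture-read", priority = 1))
    agent.submit(Task(2, "irrigation-valve", priority = 5))
    agent.submit(Task(3, "image-capture", priority = 3))
    println(agent.designate().map { it.id })  // prints [2, 3]
}
```

    In a real deployment the "cycle" would be driven by transaction requirements coming downchain, but the batching shape stays the same.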

  4. The company is going to get started in two places with an alpha prototype and an alpha deployment. The first step is to use current tech to start simulating communication interfaces with home automation tools, establishing a top-down ability to communicate. From there I plan to use grow operations to move forward: develop the laser reading mechanisms and the image processing mechanisms, and build out scalability designed around the ability to operate in potato, wheat and corn fields. For this project to actually work properly, the network will have to extend into a variety of food markets to gain any adoption, or at least adoption of its concept centered on sustainability. The basic idea is to use Kotlin to manage a bunch of mini node servers running land management operations to produce value.

  5. An NLP-based security system built around cochlear mnemonics and vernacular. This is to train and get off the ground. I chose this path because of the transient but effective uses of language. I basically want to build out a bunch of chatbots that speak Swahili backwards: real wind talkers to handle encryption and decryption. I believe that language, and the ability to get our AI and neural networks to develop their own kind of “vernacular”, is a very good step forward for cyber security. If we can handle potential threats at the input box, I think we should. I have some ideas about how to go about this, mostly utilizing the flowy, non-blocky good stuff in JavaScript. I’m benching this development for now, as I am not confident enough to make a decision or start that work up.

  6. PgSQL can handle the entire datastore. It can handle the functional DBs, the ints, the prods, the app DBs, and the NoSQL side with its JSON and BSON capabilities. It’s 100% open source and a tried and true platform. I know there’s a lot of push for “graph” databases; I don’t know their benefits and won’t pretend to. However, I believe pgSQL would do just fine with data warehousing and machine fine-tuning given a good set of star tables, and I believe SQL to be more than capable of handling these kinds of relational requirements. I’m using the JSON/BSON functionality to store configuration files, which are the only bilateral part of the current UDM concepts. From what I read, it handles itself very, very well. Plus, I never hear about the pg team in the news. But I don’t talk to people much, so I may be wrong.

  7. I plan on using Docker Compose for container management. There are two reasons behind this: I do not like the idea of the HUUUUUGE YAML file I’d have to write for a K8s cluster implementation, and Docker Compose allows for runtime configurability, which is a huge plus for me given the current conceptualization of how the pieces work together and how the UDM and configuration settings work. I could be wrong; again, I’m really out of my depth in this undertaking.

  8. It will be an agent model system. This is the right way to go for architectural decisions, especially in facets like IoT/embedded systems. The closer we get to the edge, the more compartmentalized our peripheral and mesh systems need to become; in a sense, almost independent. For that reason I have a hierarchical structure resembling the military, where agents are akin to military grades in terms of their scope, functionality and necessity. With the current UDM I am utilizing an “UpChain” and “DownChain” methodology: commands only go one way, one route, either up or down. To distribute commands in such a system, I am incorporating the “Agent” type as the bilateral messenger. In a sense, I want to build the deployment services out to run their own configuration schemes based on policy changes coming from upchain agents, who then send orders downchain to the peripherals to do work.
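    The UpChain/DownChain rule can be sketched as follows; the rank names and messages are made up for illustration, and a hop is legal only if it moves one way along the chain:

```kotlin
// Hypothetical ranks, top of the hierarchy first.
enum class Rank { BASTION, DIVISION, PERIPHERAL }

sealed class Message
data class Order(val text: String) : Message()   // travels DownChain only
data class Report(val text: String) : Message()  // travels UpChain only

// Commands only go one way, one route: orders strictly down, reports strictly up.
fun legalHop(msg: Message, from: Rank, to: Rank): Boolean = when (msg) {
    is Order -> to.ordinal > from.ordinal
    is Report -> to.ordinal < from.ordinal
}

fun main() {
    check(legalHop(Order("irrigate plot 7"), Rank.DIVISION, Rank.PERIPHERAL))
    check(!legalHop(Order("irrigate plot 7"), Rank.PERIPHERAL, Rank.DIVISION))
    check(legalHop(Report("moisture 34%"), Rank.PERIPHERAL, Rank.BASTION))
    println("all hops validated")
}
```

    The bilateral "Agent" messenger would then be the only component allowed to hold both an `Order` channel and a `Report` channel at once.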

  9. All communication is going to be routed through the sidecar. The sidecar will handle things like state management, audit logging and general system necessities like heart-beat checks. Alongside that, I also need to incorporate a GNU Radio-like service to handle EM wave sanitizing. This is a place where I would really like to use Kotlin to do the heavy lifting; I believe Kotlin is a great language for building various signal processing services.
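    For the heart-beat checks specifically, a bare-bones sketch might look like this; the device names and timeout are invented for the example:

```kotlin
// Minimal sidecar heart-beat bookkeeping: record when each peripheral
// last reported in, and list the ones that have gone quiet.
class HeartbeatMonitor(private val timeoutMs: Long) {
    private val lastSeen = mutableMapOf<String, Long>()

    fun beat(device: String, now: Long) { lastSeen[device] = now }

    fun stale(now: Long): List<String> =
        lastSeen.filter { now - it.value > timeoutMs }.keys.sorted()
}

fun main() {
    val monitor = HeartbeatMonitor(timeoutMs = 5_000)
    monitor.beat("node-a", now = 0)
    monitor.beat("node-b", now = 3_000)
    println(monitor.stale(now = 6_000))  // prints [node-a]
}
```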

  10. I need guidance on what to use for the machine learning parts of this project. Earlier drafts focused on BERT/TensorFlow TFX/Flink FLIP-39 Runner to handle the AI. I also had an ONNX draft. I would prefer to stick to my guns and go with my gut: Kotlin. I’m just unsure of what that looks like at present.

I appreciate the Kotlin members for allowing me to post this here. It’s been a good help to get this out on paper.

Alex - Update 30/May/2021

I am unable to make update edits to the post, so I have to reply.

I am moving forward with the regex approach to various information digestion. Namely, in circumstances where I will be getting graph data, multidimensional matrices and things of that nature, the project is now moving to develop digestion techniques based on character/value recognition at the door instead of computation.

Using the multi-line string capability in Kotlin, I was able to digest index and value information from a multidimensional matrix using only the definition of the matrix in string form, then utilizing regex to pick out the values from non-important data and map them in a single line. I am very stoked. This will make linear algebraic operations a lot easier and cleaner than nested looping techniques.
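A minimal sketch of what that digestion can look like; the regex and the index mapping here are my own reconstruction for illustration, not the repository code:

```kotlin
// Digest a matrix given only in string form: one regex pass maps each
// number to its (row, column) index, no nested index arithmetic needed.
fun digest(matrix: String): Map<Pair<Int, Int>, Int> =
    matrix.trimIndent().lines().withIndex().flatMap { (row, line) ->
        Regex("-?\\d+").findAll(line).withIndex().map { (col, m) ->
            (row to col) to m.value.toInt()
        }.toList()
    }.toMap()

fun main() {
    val m = """
        1  2  3
        4  5  6
    """
    println(digest(m)[1 to 2])  // prints 6
}
```

Because the source of truth is the string itself, the same pass also tolerates ragged spacing between values.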

I am pushing the string approach because I believe it to be a better way to approach the cost of calculating. Instead of finding out too late that my process is far too much for me to handle, I can now assess it as a string: dynamic and precise, it is a set of different patterns of data rather than a reification of the actual value it is meant to represent. To make it clearer: understanding that infinity is limitless and a concept of the eternal is one thing; it is an entirely different thing to sit down and try to compute a number so absurdly large that the mere conceptualization of it is nigh impossible.

To break it down more, saying “no thanks” before you open the nasty-grams of non-viable requests is a huge step forward.
I can define and provide integrity while not risking costly digestion processes: the process acknowledges that the value exists, but it doesn’t try to allocate space to reify it.
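A small sketch of that gatekeeping, assuming a numeric payload and an arbitrary digit cutoff chosen for the example:

```kotlin
// Say "no thanks" at the door: judge a numeric payload by its string
// shape and length, without ever allocating the number it represents.
fun viable(payload: String, maxDigits: Int = 18): Boolean {
    val digits = payload.trim().removePrefix("-")
    return digits.isNotEmpty() && digits.all { it.isDigit() } && digits.length <= maxDigits
}

fun main() {
    check(viable("123456789"))
    check(!viable("9".repeat(500)))  // acknowledged, never reified
    check(!viable("not a number"))
    println("gate checks passed")
}
```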

Last update: I am starting Perl and am hoping to do some more integration with Kotlin using it. Perl can do embedded systems and, from what I can surmise, do it very, very well. Syntactically, I would like to test incorporating it to build out NLP AI that can recognize certain scripting languages and proceed, in an unsupervised fashion, to communicate with each other using the more colloquial syntax Perl provides, as saying “my banana” carries more meaning for NLP than “var Banana” does.

That is a good update and development in this sphere.
val route =
T --------------M---------------I

This is the floor plane coordinate plot used. The borders are single space literals, and the limit’s returns are the carriage returns that down-shift the integer values.

  1. We first run a regex to get the X-plane quadrant limits, make sure the span is even, and establish the middle (M), giving us
    Quadrant 1 (I) and Quadrant 2 (T). Any position from Position T to Position M is considered -X (Java/Kotlin reads from
    left to right). Anything from Position M to Position I is considered +X. The quad limit +/- divided by 2 must be 0, so Abs(T-M) == Abs(I-M).
  2. We run a regex to get the Y-plane coordinate limits: everything from Position T to Position E. Everything from Position T
    down to M–O–M below is considered +Y, and anything from Position M to Position E is considered negative.
    The quad limit +/- divided by 2 must be 0, so Abs(T) == Abs(E), and we validate the limit.
  3. We run a validation check to get the limits of Quadrant 4 (F) and validate (Pos(E)-Pos(V)) == (Pos(T)-Pos(I)).
  4. Once the limits are established, we store them in a semi-mutable object where we only validate that the limits have not changed
    whenever there is a call from anywhere on the coordinate plane as we receive dynamic, changing stream data.
  5. Pos O is the origin point of the planes, and the crossing of the origin point marks where we transition from the upper quadrants to the lower ones.
  6. Repeat the above process to validate the Y limits.
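A toy version of step 1, using plain index lookups in place of the regex; the border string and marker letters follow the plot above:

```kotlin
import kotlin.math.abs

// Find the X-plane markers T, M and I on the border string and validate
// that M splits the plane evenly, i.e. Abs(T - M) == Abs(I - M).
fun xLimits(border: String): Triple<Int, Int, Int>? {
    val t = border.indexOf('T')
    val m = border.indexOf('M')
    val i = border.indexOf('I')
    if (t < 0 || m < 0 || i < 0) return null
    return if (abs(t - m) == abs(i - m)) Triple(t, m, i) else null
}

fun main() {
    println(xLimits("T----M----I"))  // prints (0, 5, 10)
    println(xLimits("T--M----I"))   // uneven split: prints null
}
```

The Y-plane pass in step 2 would be the same idea applied to line indices instead of character columns.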

This serves as a mapping of general peripheral locations on a larger scale and allows for more granulated mappings. This is a cursory prototype, but in the end I am pushing for the recognition of color through hex values on a map overlay as the object of ingestion. This is just a small-scale conceptualization showing that it is possible.

I have not implemented the above steps in the repository yet, as I intend to make changes: I will have to downshift/upshift depending on ranges. The mechanisms for generating the data grid are better kept sequential, small and incremental, as that better allows me to debug unintended output. As it sits right now, I plan to use this grid ingestion/digestion method to run configuration scripts to the peripheral devices of a set area of land. Each map should be able to granularize an array, and for the UI we can have a literal representation of what our pipe was initiated as, allowing us to better move peripheral devices.

This mapping sequence only needs to run once, as it should establish a tangible area and would allow the mapping and remapping of peripheral objects: the data grid will have coordinate values for the map and can zero out and re-initialize a different position on the map.

That’s the progress, roadmap and reasoning I have for the project. I use monospacing for the current implementation, and the grid above is just to help conceptualize the idea.

I still might have made a mistake in writing this, and if so I apologize.


Quick update: the project is now called NerveBus, and I am repurposing the work done so far to better fit the actual modelling.


The Stack is as follows, and if anyone reading this has any feedback, it’d be much appreciated.

I personally do better with a visual aid/data-flow diagram, so here is the build out:

“MAV Swarm” originally meant a MAV swarm with a semi-decentralized hive mind, so that wording needs to be changed. I want to replace it with a mesh network of peripheral devices.

Also, the Alpha Channels and Delta Channels aren’t going to be used; instead, the OAI eNodeB will now act as a “Bastion Server”, hopefully using some C/C++ interop.

I am looking to implement Akka Streams, and I’ve been working to better learn the concepts of concurrency, async parallel processing, crash management and data flows in the Actor paradigm. Needless to say, it is a fascinating subject, but also one that can make the head hurt.

Sub System: Homeostasis - K8s clusters have a built-in sidecar, so the ingress data will be event driven, and after that I’d like to have different Akka Streams handle the ingress data stages and egress data stages. The Homeostasis subsystem also handles the system’s transition of state, using the BBBarrier Agent (Blood Brain Barrier Agent) to decide whether a message is escalated to induce a change of state and data flow.

Sub System: WaveChef - I’m looking to implement GNU Radio recipes to handle signal sanitization, to account for potential malicious subpacket injection from the SigOver attack vector and other security issues. It will need to do decibel strength and latency checks, as the margin of error is getting slimmer by the day, and having to account for systems that already do chip corrections at the circuit level is a very big uphill barrier.

Sub System: Optical Cypher - This will be revamped, as I want to start implementing a type of “fluid” encryption based on mnemonics and vernacular. In essence, I’d like the end stage to be NLP AI acting like the Navajo wind talkers did for encryption. Using object recognition and sentiment analysis, I’d like to toy with the idea of unsupervised learning to see if a “local” vernacular can be established and used to encrypt and decrypt messages. The best secret is the one no one can understand.

Sub System: PreFrontalCache - I’m looking at using a Redis cache to store messages and transactions still in flux, the better to know how to re-allocate resources. The pipeline is modelled after infantry division commands, so “divisions” will be able to do strategic re-allocation of power further down range.

Sub System: VagusNerve - This is essentially the mainline of communication with the datastore and the outside system. I want to incorporate the idea; however, I do not know if it would necessarily fit with the principles of the Actor paradigm.

The data store as of right now is mainly PostgreSQL. With its BSON/JSON NoSQL capabilities, it should be more than capable of handling warehousing needs, or at least I don’t see where it would fail. Again, this is a question I have that I don’t know how to find the answer to.

That covers it for now. I am currently working on getting the documentation back, as shady OSINT and infosec companies hacked me last October, destroying the project and the prototyping that went with it. I’ve only managed to recover these two resources, and I’ve lost a fair amount of the technical knowledge. Given the scope of the project, it is extremely unlikely I can retain what I lost if I cannot access the docs. This does, however, provide a lot more structure around a massive project.

A journey of a thousand miles starts with a single step.
And, if the wind will not serve, take to the oars.
So I’m out here, rowing.


I decided to go to C++. Y’all take care.