After chatting with some of you on this forum and seeing that we all are on Lemmy rather than Reddit, I think it would be a good idea for us to have some study groups to improve our technological literacy and competency.

During my time on Lemmy, I've been able to increase my digital literacy and overall knowledge surrounding my system. I've loved the nearly endless rabbit holes Wikipedia has pulled me into, as well as the resulting happiness that comes from finally fixing a broken Linux system or piece of technology.

But what exactly does technological literacy encompass, one might ask? I'd like to illustrate with an anecdote. When I first got into Linux, I was told to "get a terminal emulator to SSH into the HPC so that you can run computational jobs." To most of you this sentence is completely normal, but to my unconditioned mind it felt like a big bright light being flashed in my eyes while my PI spoke Martian at me. After the initial disorientation, I downloaded what I thought was my only option for a terminal emulator (MobaXTerm) and found myself sitting in front of a pitch-black terminal screen with a blinking prompt, not knowing what a host was, how to manage a network, or any Linux commands (coreutils, never heard of her...), and unable to do much of anything beyond opening up WoW and Google Docs. The only thing more advanced than the plug-and-play Google/Microsoft software I used was my botched LaTeX setup, which I used to typeset math equations for my students, my homework, and my lab reports, because I could type in TeX far faster than I could click on every Greek letter or symbol I needed. Overall, it really hurt my ability to do the research I was tasked with. I was supposed to learn how to use Vim as my IDE when the only IDE I had ever worked in was Spyder from Anaconda! VSCodium, CodeBlocks, Emacs, etc.: I didn't even know these existed.
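Just to show what I mean about the TeX speedup (this snippet is purely illustrative; the equations are made up for the example), a single line of source produces a fully typeset result, and typing names like \Delta or \sigma is much faster than hunting through a symbol palette:

```latex
% Illustrative only: typing symbol names beats clicking them one by one.
\[
  \Delta G = \Delta H - T\,\Delta S,
  \qquad
  \sigma = \sqrt{\tfrac{1}{N}\sum_{i=1}^{N} (x_i - \mu)^2}
\]
```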

Needless to say, it was extremely discouraging to be thrown head first into a difficult scenario with very little assistance while trying to juggle coursework and outside responsibilities. Those humble beginnings reinforced my fear that if I experimented with my computer and messed something up on the OS side, I'd brick my hardware and end up as some variation of Homer Simpson holding up the "So You Broke the Family Computer" book.

I'm sure we all come from varying backgrounds of computer literacy, so I've proposed a couple of possible areas of study that we could set up in small or large groups depending on interest. For each topic, the meeting frequency, the literature references (textbooks, white papers, blogs, forums, etc.), and the project goal (which could be concrete or abstract) should be drawn up and worked towards to keep the group focused. I've come up with a few fields for us to start with; feel free to add to the list or modify what I've written.

  1. Cryptography with a rigorous mathematical foundation, applied to both classical and quantum computing paradigms (AES, RSA, hash functions beyond the surface level, information theory (we love our boy Claude Shannon), cryptographic primitives, Shor's Algorithm, etc.); see the small entropy sketch after this list.
  2. A hardware-agnostic study of firmware (what unifying principles of firmware can empower the user to understand why certain aspects of a device are not functioning?).
  3. Hardware architectures (GPU, NPU, TPU, CPU, RAM, DIMM).
  4. Form factors (how geometry can impose certain design decisions, and so forth).
  5. Fundamentals from first principles, i.e., condensed matter physics theories for understanding classical computing systems. The group could also choose to segue into topological states of matter (Dirac fermions, Weyl semimetals, Mott insulators, and a myriad of other cool states of matter that aren't really discussed outside of physics or graduate engineering classes), qubits (Bloch sphere representations), and loads of other things that I'm sure exist but am unaware of.
  6. LLM inference technology and how it can be applied to case law, accounting, stocks, and various other fields where the solution to a problem lies somewhere in an encoded technical language.
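To make topic 1 a bit more concrete, here is a minimal sketch (my own example, not part of the original list) of Shannon entropy, the information-theoretic quantity that a lot of the cryptography material above builds on; the function name and test strings are arbitrary:

```python
# Minimal sketch of Shannon entropy: the average number of bits of information
# per symbol in a message (sum of p * log2(1/p) over the symbol frequencies).
# The function name and test strings are illustrative only.
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    counts = Counter(message)
    total = len(message)
    return sum((n / total) * log2(total / n) for n in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 -> a perfectly predictable message
print(shannon_entropy("abcdefgh"))  # 3.0 -> eight equally likely symbols need 3 bits each
```

A group could start from a toy like this and work up to why cryptographic keys have to be drawn from high-entropy sources.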

I'd like to begin the discussion with this as our starting framework. Does anyone have interest in the topics listed above, or suggestions for other subjects? How should we manage these groups? Should we add some chats to the Matrix instance?

[–] nickwitha_k@lemmy.sdf.org 3 points 1 year ago (1 children)

Thank you! I would say that I currently have the most interest in hardware architecture; I recently got an FPGA board to try to learn more about it, especially RISC-V and vintage/extinct architectures, as well as GPUs (at some point, I want to build an entire computer capable of running modern software, completely on open-source hardware, gateware, firmware, and software).

I have a bit of interest in cryptography, but my ADHD has been a bit of a blocker to digging too deep into it. I have less interest in LLMs, though I am supporting some coworkers in their interest in them; I'm probably a bit fatigued from the hype and wishing it were actual AI.

I would say that, skill- and experience-wise, programming languages, virtualization, and operating in Linux are well within my wheelhouse. I mainly use neovim as an IDE and program in Go and Python, though I've learned a bit of Rust (embedded), Lua (for neovim), and C (mainly Arduino). Frontend is less in my interest/skillset.

[–] gronjo45@lemm.ee 3 points 1 year ago (1 children)

I've heard interesting things about the RISC-V architecture, which have made me want to look into it more. Logic design is something I feel isn't touched upon nearly as much as it should be in other technical disciplines. My undergraduate engineering coding experience was, for lack of a better term, complete shit.

That's another reason why I think a group of highly motivated, like-minded neurodivergent individuals could help keep each other focused on a grander goal. The fatigue is definitely real; sometimes learning 200 acronyms through Anki and then trying to act on what I've learned leaves my brain all numb lol

I think having a masterclass to get everyone on board with using Vim/Neovim/Emacs (something that at least has much more versatility as an editor) would be worthwhile as well. Embedded systems hold particular interest for me too. I took a control theory class in my senior year where we had to analyze reactors, distillation columns, and disturbances in steady states (i.e., fluctuations in P, T, x_i (liquid fraction compositions), and many more). I'm assuming the embedded systems that actually implement this kind of control could go into the curriculum too. I think my background in chemical engineering could position us well to discuss lithography techniques and various quality control instrumentation.

[–] nickwitha_k@lemmy.sdf.org 2 points 1 year ago

Oooh! Very cool stuff. As one whose academic background was in chemistry, I really like that.

I've heard interesting things about the RISC-V architecture, which have made me want to look into it more. Logic design is something I feel isn't touched upon nearly as much as it should be in other technical disciplines. My undergraduate engineering coding experience was, for lack of a better term, complete shit.

Yeah. I've been slowly working through Learning the Art of Electronics, but too much has come up this year. RISC-V, coupled with artificial hardware shortages (scalpers), is really what got me interested in digital circuits: I've been a FOSS proponent for years, and I've been growing both more annoyed at the lack of innovation and effort to rectify supply shortages and less and less comfortable with centralized control of hardware implementation (repairability, privacy, modifiability, and the general ability to figure out how a thing works).

RISC-V, to me, seems to be solving a lot of that by encouraging more open approaches to ISA development. It isn't quite GPL-level, but some of the most performant and general-purpose-oriented implementations have been open-sourced by their implementers (e.g., the Berkeley BOOM series of cores, and even Alibaba's T-Head cores, which are likely to be present in servers by the end of the decade).

The biggest challenge, to me, is that there are not currently any great ways to make ISA implementations democratized, accessible, and performant for modern use cases. The performance side is directly tied to the two options for implementation: FPGAs and custom silicon.

FPGAs are limited both in the number of logic cells and physically, due to the size and nature of their logic gates (bigger and farther apart means slower and less efficient). For example, an implementation of a modified Berkeley SonicBOOM (BOOMv3) that is hardened against Spectre attacks takes about 115k LUTs per 100 MHz core; my Xilinx 7-series FPGA, which is four generations old, only has an 85k LUT capacity, so even a single core wouldn't fit, and a cutting-edge FPGA costs thousands of dollars for the bare chip alone.

Custom silicon is just insanely expensive and out of reach for most. The lithography equipment necessary for current process nodes is pretty much only available to multinational corporations. I'm hoping alternatives that are "good enough" for modern workloads become more available.

I think having a masterclass to get everyone on board with using Vim/Neovim/Emacs (something that at least has much more versatility as an editor) would be worthwhile as well.

Absolutely. That sounds like a worthwhile undertaking.

Embedded systems hold particular interest for me too. I took a control theory class in my senior year where we had to analyze reactors, distillation columns, and disturbances in steady states (i.e., fluctuations in P, T, x_i (liquid fraction compositions), and many more). I'm assuming the embedded systems that actually implement this kind of control could go into the curriculum too.

Indeed. I'm mainly self-taught there, with a focus on hobbyist MCUs, but the entry requirements, both financial and knowledge-wise, have gone down significantly. An RP Pico W, a dual-core 32-bit ARM MCU board with onboard WiFi and BLE, can be had for $6. It can be programmed in bare C, Arduino C, at least two implementations of Python, Rust, and a number of other languages, including drag-and-drop languages like MakeCode.
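As a taste of how low that entry barrier is, here is a minimal MicroPython sketch (my own illustration, not something from the thread; the two Python implementations above are presumably MicroPython and CircuitPython) that blinks the Pico W's onboard LED. On the Pico W the LED is wired through the wireless chip, so recent MicroPython builds expose it by the name "LED" rather than a GPIO number:

```python
# Illustrative MicroPython sketch for a Pico W: blink the onboard LED.
# On the Pico W the LED is attached to the wireless chip, so it is
# addressed by name ("LED") instead of a pin number.
from machine import Pin
import time

led = Pin("LED", Pin.OUT)

while True:
    led.toggle()       # flip the LED state
    time.sleep(0.5)    # wait half a second
```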

Definitely interested to see where this goes and contribute.