ncnn is a high-performance neural network inference framework optimized for the mobile platform
Parsing gigabytes of JSON per second: used by Facebook/Meta Velox, the Node.js runtime, ClickHouse, WatermelonDB, Apache Doris, Milvus, StarRocks
QuestDB is a high-performance, open-source, time-series database
OpenGL Mathematics (GLM)
A blazingly fast JSON serializing & deserializing library
Performance-portable, length-agnostic SIMD with runtime dispatch
The Compute Library is a set of computer vision and machine learning functions optimised for both Arm CPUs and GPUs using SIMD technologies.
🚀 efficient approximate nearest neighbor search algorithm collections library written in Rust 🦀 .
Open source c++ skeletal animation library and toolset
📽 Highly Optimized 2D / 3D Graphics Math (glm) for C
Fast Open-Source Search & Clustering engine × for Vectors & 🔜 Strings × in C++, C, Python, JavaScript, Rust, Java, Objective-C, Swift, C#, GoLang, and Wolfram 🔍
Up to 10x faster strings for C, C++, Python, Rust, and Swift, leveraging NEON, AVX2, AVX-512, and SWAR to accelerate search, sort, edit distances, alignment scores, etc 🦖
C++ wrappers for SIMD intrinsics and parallelized, optimized mathematical functions (SSE, AVX, AVX512, NEON, SVE)
Inference Llama 2 in one file of pure 🔥