TheTrueLinuxDev

joined 1 year ago
[–] TheTrueLinuxDev@programming.dev 5 points 1 year ago* (last edited 1 year ago) (5 children)

This is not the first time this has happened with Dotnet open source packages; there have been some pretty funky things going on, namely:

ImageSharp (they re-licensed from Apache 2.0 to something like a split community/commercial license, and a huge fit was thrown over it)

Fody (it expects contributors to Fody to be patrons.)

[–] TheTrueLinuxDev@programming.dev 0 points 1 year ago (1 children)

It's rooted in my frustration from when I was trying to fill in missing implementation details in projects like Skia (at the time, it lacked support for Vulkan). My fundamental core belief is that core libraries, say Skia, neural network frameworks, and other crucial projects like that, should offer a C API that lets every type and implementation be extended from any other language that can interface with a C API, for example by providing your own vtable.

One approach I took for my GUI toolkit written in C (specifically on Linux, to replace Qt and GTK) was implementing single-inheritance object-oriented programming in C: if you embed the base class struct at the top of your custom struct type and provide your own vtable for those objects, you can natively extend the underlying library in whatever programming language you use, assuming it can talk to a C API in a complete sense.
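To sketch that pattern (the names below are illustrative, not my toolkit's actual API), a derived type embeds the base struct as its first member and supplies its own vtable:

#include <stdio.h>

/* Illustrative single-inheritance OOP in C; not the actual toolkit API. */

typedef struct Widget Widget;

typedef struct WidgetVTable {
    void (*draw)(Widget *self);
    void (*destroy)(Widget *self);
} WidgetVTable;

struct Widget {
    const WidgetVTable *vtable;
    int x, y;
};

/* The "library" only ever sees Widget* and dispatches through the vtable. */
static void widget_draw(Widget *w) { w->vtable->draw(w); }

/* A "derived" type, definable from any C-ABI-capable language: the base
   struct goes first, so a Button* is safely usable wherever a Widget* is
   expected. */
typedef struct Button {
    Widget base;          /* must be the first member */
    const char *label;
} Button;

static void button_draw(Widget *self)
{
    Button *b = (Button *)self;   /* safe: base is the first member */
    printf("button '%s' at (%d, %d)\n", b->label, self->x, self->y);
}

static void button_destroy(Widget *self) { (void)self; }

static const WidgetVTable button_vtable = { button_draw, button_destroy };

int main(void)
{
    Button b = { { &button_vtable, 10, 20 }, "OK" };
    widget_draw(&b.base);   /* library code calls back into "derived" code */
    return 0;
}

Because the base struct is the first member, a pointer to the derived type is also a valid pointer to the base, which is exactly what makes this workable over a plain C FFI.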

Let me know if you want a fuller demonstration; I'd be happy to find the time to set up a small sample to give you an idea of how it's done.

I'm also aware of the criticisms of that approach: the verbosity of implementing object-oriented programming in C is kind of absurd, and the API surface balloons. That is largely why I work on a compiler-generator framework built specifically to address those challenges by letting me add dialects on top of the C language, such as generics and object-oriented programming. It brings C closer to C# in terms of syntax and features, and at the end of compilation it still produces readable C code as output. It also generates what I call an FFI-JSON: essentially a JSON file that describes all of the types used in a C project, the sizes of integer/floating-point types, structure types with their fields/offsets/sizes/comments, and function declarations. It's done in a way that lets you read the JSON file and generate a binding library for your programming language, saving you weeks of work.
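As a purely hypothetical illustration of that idea (the schema and field names below are made up for this comment, not my generator's actual output), a descriptor for a small struct and one function might look something like this:

/* Hypothetical FFI-JSON for the declarations below, assuming a typical
   64-bit target where double is 8 bytes; the real schema may differ.

   {
     "types": {
       "Point": {
         "kind": "struct",
         "size": 16,
         "fields": [
           { "name": "x", "type": "double", "offset": 0, "size": 8 },
           { "name": "y", "type": "double", "offset": 8, "size": 8 }
         ]
       }
     },
     "functions": [
       {
         "name": "point_length",
         "returns": "double",
         "params": [ { "name": "p", "type": "const Point *" } ]
       }
     ]
   }

   A binding generator for C#, Python, etc. can walk a file like this and
   emit the equivalent struct layout and extern declarations mechanically. */

typedef struct Point { double x, y; } Point;
double point_length(const Point *p);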

[–] TheTrueLinuxDev@programming.dev 1 points 1 year ago (3 children)

I don't think so, on the extensibility aspect alone: some Rust syntax and traits don't map well to other languages when those languages attempt to extend a Rust library. I write C code in a way that makes it extensible from any language.

[–] TheTrueLinuxDev@programming.dev 3 points 1 year ago (5 children)

Yep, the biggest reason why I chose C is the foreign function interface. Code you write in C is more than likely usable from just about any other language.

[–] TheTrueLinuxDev@programming.dev 5 points 1 year ago* (last edited 1 year ago) (7 children)

I would most likely be using C11, for threads.h and stdatomic.h, for the foreseeable future. The problem with using the latest and greatest standard is the risk of compilers not supporting it, so I would likely wait at least 5 years before switching to C23, sometime in 2028 or 2029. There was a bit of controversy around the optional bounds-checking interfaces added in C11, to the point that people pushed to deprecate or remove them, and I'm sure C23 will have something similar going on.
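For reference, this is the kind of plain C11 threads.h/stdatomic.h usage I mean (a minimal standalone sketch, not code from my project): a few threads bump a shared atomic counter without a mutex.

#include <stdatomic.h>
#include <stdio.h>
#include <threads.h>

static atomic_int counter = 0;

/* Each worker increments the shared counter; relaxed ordering is enough
   because we only need the final total, not synchronization of other data. */
static int worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < 100000; i++)
        atomic_fetch_add_explicit(&counter, 1, memory_order_relaxed);
    return 0;
}

int main(void)
{
    thrd_t threads[4];

    for (int i = 0; i < 4; i++)
        thrd_create(&threads[i], worker, NULL);
    for (int i = 0; i < 4; i++)
        thrd_join(threads[i], NULL);

    printf("counter = %d\n", atomic_load(&counter));
    return 0;
}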

I don't plan on using #embed or constexpr, in favor of maintaining common C programming practices; language familiarity is still an important factor in a thriving project, however much people nag me to rewrite everything in Rust or C++.

[–] TheTrueLinuxDev@programming.dev 4 points 1 year ago* (last edited 1 year ago) (1 children)

Probably this script:

#!/bin/bash

# Mirror a remote git repository into a local Gitea instance via the migrate API.
if [ -z "$1" ]
then
        echo "Please provide a git repository URL as an argument for this script."
        exit 1
fi

# Rough URL validation.
regex='(https?|ftp|file)://[-[:alnum:]\+&@#/%?=~_|!:,.;]*[-[:alnum:]\+&@#/%=~_|]'
if [[ $1 =~ $regex ]]
then
        # Derive the repository name from the URL (strip any trailing ".git").
        basename=$(basename "$1")
        reponame=${basename%.*}
        curl -X 'POST' 'https://localgitea.com/api/v1/repos/migrate?access_token={Access Token Here}' \
          --insecure \
          -H "accept: application/json" \
          -H "Content-Type: application/json" \
          -d '{  "clone_addr": "'"$1"'",  "issues": false,  "labels": false,  "lfs": false,  "mirror": true,  "mirror_interval": "96h0m0s",  "private": false, "repo_name": "'"$reponame"'", "pull_requests": true,  "releases": true, "repo_owner": "githubpublic",  "service": "git",  "wiki": true}'
else
        echo "Invalid URL"
        exit 1
fi

You can adjust it as needed. As for why I have the --insecure flag: there is a direct network cable between my PC and the server, so encryption/HTTPS isn't needed there. This is probably my favorite command. I save the above as .sra.sh in my home directory, then create an sra command by adding alias sra=/home/{your user account}/.sra.sh to .bashrc, and from there, any time I find an interesting repository that I want to archive, I simply run sra {git url} and that's it. It also sets the mirror interval manually to a 4-day interval rather than every 8 hours, which would needlessly spam the git server.

This is something I rely on every day, both as a developer and as a sysadmin; I maintain a separate supply chain and guard against supply-chain attacks by generating my own package feeds/registry automatically from Gitea/Forgejo.

Edited to add: I noticed this community is PowerShell, so here's the PowerShell version of the above:

param (
    [Parameter(Mandatory=$true)]
    [string]$gitRepoUrl
)

# Rough URL validation (the .NET regex engine does not support POSIX classes
# like [:alnum:], so the character set is spelled out explicitly).
function Test-Url($url) {
    $regex = '(https?|ftp|file)://[-A-Za-z0-9+&@#/%?=~_|!:,.;]*[-A-Za-z0-9+&@#/%=~_|]'
    return $url -match $regex
}

if (-not (Test-Url $gitRepoUrl)) {
    Write-Error "Invalid URL"
    exit 1
}

# Derive the repository name from the URL (strip any trailing ".git").
$basename = Split-Path $gitRepoUrl -Leaf
$reponame = [System.IO.Path]::GetFileNameWithoutExtension($basename)

$headers = @{
    'accept' = 'application/json'
    'Content-Type' = 'application/json'
}

$body = @{
    'clone_addr' = $gitRepoUrl
    'issues' = $false
    'labels' = $false
    'lfs' = $false
    'mirror' = $true
    'mirror_interval' = '96h0m0s'
    'private' = $false
    'repo_name' = $reponame
    'pull_requests' = $true
    'releases' = $true
    'repo_owner' = 'githubpublic'
    'service' = 'git'
    'wiki' = $true
} | ConvertTo-Json

# -SkipCertificateCheck mirrors the --insecure flag in the bash version.
Invoke-RestMethod -Uri 'https://localgitea.com/api/v1/repos/migrate?access_token={Access Token Here}' -Method POST -Headers $headers -Body $body -SkipCertificateCheck

Because the code they release to the public is usually MIT-licensed, like the Dotnet Core runtime. They just have a long history of hating GPL-licensed software.

I tried to use it, but it has some big reliability issues, because at the end of the day, despite the dataset it's trained on, it's still something I'd describe as "language interpolation."

It sometimes makes TERRIBLE recommendations for which tools/libraries I should explore, because it assumes those libraries have the support I need. They never did, and so I wasted weeks on it. (It doesn't help that both the code and the project are undocumented.)

So after that experience, I demoted ChatGPT's usefulness to just "cleaning up pre-written documentation so it sounds better." That's it.
