When I commit, I write a title for what I did, then describe it, and use periods for related commits. Just easier.
I’m just glad I have other options than just Python. I’m not afraid of writing my own solutions either. I rarely use Python these days.
For small projects, rewriting is often superb. It allows us to reorganize a mess, apply new knowledge, add neat features and doodads, etc.
This. I’m contributing to an open-source project with a very small number of coders, written in a non-mainstream Domain-Specific Language. A lot of the code I wrote before has been proven to work from time to time, but all of it could benefit from better outputs and a better GUI. So, I end up reengineering the entire thing, and that’ll take a really long time; however, I do a lot of tests to ensure it works.
I don’t understand your problem well enough to know if you can (or want to) use this here, but you might be able to tap into that C performance with the radix conversion formatting of printf.
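For reference, a minimal sketch of what that printf radix formatting covers (only the bases printf supports natively for fixed-width integers, which may or may not fit your case):

```cpp
#include <cstdio>

int main() {
    unsigned value = 20250;               // arbitrary example value
    std::printf("decimal: %u\n", value);  // base 10
    std::printf("octal:   %o\n", value);  // base 8
    std::printf("hex:     %x\n", value);  // base 16
    // Note: printf has no conversion for arbitrary radixes, nor for
    // arbitrary-precision integers.
    return 0;
}
```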
The problem is printing a big binary number in decimal. That’s not an easy problem because 10 is not a power of 2. If we lived in a base-16 world, this would be very easy to solve in O(n).
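To illustrate the point (this isn’t your actual code, just a throwaway C++ sketch): when the target base is a power of 2, each output digit depends on a fixed group of bits, so one O(n) pass is enough. Decimal digits don’t line up with bit boundaries, so you end up repeatedly dividing the whole number instead.

```cpp
#include <cstdio>
#include <string>

// Hex is easy: each output digit is exactly 4 input bits,
// so the whole conversion is a single O(n) pass.
std::string bits_to_hex(const std::string& bits) {
    static const char digits[] = "0123456789abcdef";
    std::string padded = bits;
    while (padded.size() % 4 != 0) padded.insert(padded.begin(), '0');
    std::string out;
    for (size_t i = 0; i < padded.size(); i += 4) {
        int nibble = 0;
        for (int j = 0; j < 4; ++j)
            nibble = nibble * 2 + (padded[i + j] - '0');
        out += digits[nibble];
    }
    return out;
}

int main() {
    std::printf("%s\n", bits_to_hex("101101011111").c_str());  // prints b5f
}
```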
Also, I can’t access that, as G’MIC is a language that can’t really communicate with other languages since it’s not meant to share memory.
This could be an XY problem, that is, you’re trying to solve problem X, rather than the underlying problem Y. Y here being: Why do you need things to be in decimal in the first place?
I wouldn’t say it’s needed, but this is more of a fun thing for me. The only thing I’m using this for is Tupper’s Self-Referential Formula, and my current approach of converting base 1&lt;&lt;24 to base 1e7 works instantly for 106x17 binary digits. When I load an image into that filter that’s larger than somewhere around 256x256, the delay becomes noticeable because the underlying algorithm isn’t that great, but it could also have to do with the fact that G’MIC is interpreted, and despite its JIT support, this isn’t the kind of problem it’s meant to solve (it’s Domain-Specific). On the bright side, this algorithm works with any pair of data types as long as one is one level higher than the other; in this case I’m using the lowest levels (single and double), and the bigger the data type, the faster it can be.
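For anyone curious, here’s a rough C++ sketch of the repeated-division idea as I understand it (not the G’MIC code; limb layout, names, and the quadratic structure are my assumptions): the number is stored as base-2^24 limbs and is repeatedly divided by 10^7, with each remainder becoming one base-1e7 digit. Each pass is O(n), so the whole conversion is O(n^2), which is why it slows down on large inputs.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Convert a big number stored as base-2^24 limbs (most significant first)
// into base-1e7 digits (least significant first) by repeated division.
std::vector<uint32_t> to_base_1e7(std::vector<uint32_t> limbs) {
    const uint64_t SRC = 1u << 24;   // source base
    const uint64_t DST = 10000000;   // destination base, 1e7
    std::vector<uint32_t> out;
    while (!limbs.empty()) {
        uint64_t carry = 0;
        for (size_t i = 0; i < limbs.size(); ++i) {
            uint64_t cur = carry * SRC + limbs[i];
            limbs[i] = static_cast<uint32_t>(cur / DST);
            carry = cur % DST;
        }
        out.push_back(static_cast<uint32_t>(carry));       // one base-1e7 digit
        while (!limbs.empty() && limbs.front() == 0)        // drop leading zeros
            limbs.erase(limbs.begin());
    }
    return out;
}

int main() {
    // 3 * 2^48 + 5 * 2^24 + 7 = 844425014018055
    std::vector<uint32_t> n = {3, 5, 7};
    for (uint32_t d : to_base_1e7(n)) std::printf("%u ", d);
    std::printf("\n");  // prints: 4018055 4442501 8 (least significant digit first)
}
```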
I don’t think we have a difference of opinion. What I’m saying is that some apps have many years of development behind them, and in those cases C++ will likely remain the only realistic option because switching would be far too time-consuming. For example, Krita. I do agree that when there’s a choice, C++ is less relevant these days.
C++ is still used for some popular applications, and for those it is still the only realistic option. I think there should be more Domain-Specific Languages. I want one for vector graphics like G’MIC is for raster graphics.
I’ve been meaning to learn Ruby to get around using Python. I like Ruby’s syntax better.
Coming from someone who has used 4 different languages (C#, C++, Python, and G’MIC), I just feel more comfortable when there are explicit end blocks, which is why I don’t like Python. Of those languages, only Python lacks an explicit end block, which is off-putting in my opinion, and there aren’t any other options that fill a similar role to Python.
You mean an interpreted language with a similar role to Python, but more in the Rust/C++ style? I actually want that, so I could ditch Python even though I’ve learned it, and use this instead.
This is great, even though when I code in Python, I’m not using it for performance reasons but for convenience.
I kind of like it. I can tell where it starts and ends.
Chances are there’s something similar to a Python dictionary in your language, or at least it’s an import/#include away. Although I don’t use general-purpose programming languages much at all, in my language of choice (G’MIC) I do something like dict$var=input, where $var is a defined variable, and this way I can access input by doing ${dict$var}, which is similar to a Python dictionary. In C++, there are hash table implementations in the standard library (std::unordered_map) as well as third-party ones on GitHub. That being said, there are times when you don’t need a hash map at all, and accessing the data is as simple as basic arithmetic.
Seems like a good idea; I’m hoping the syntax is sane. As far as languages go, I think you’re missing G’MIC in the comparison, as it has things like FFT and other tools for image processing, which is just one part of digital signal processing. And then there’s Python with its libraries, and so on.
This is what I prefer too! I also sometimes prefer to use bit shifts for multiplication or division by a power of 2.
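A tiny C++ sketch of what I mean (values are arbitrary); compilers will usually make this substitution themselves, so it’s mostly a readability/habit choice:

```cpp
#include <cstdio>

int main() {
    unsigned x = 52;
    std::printf("%u\n", x << 3);  // x * 8  -> 416
    std::printf("%u\n", x >> 2);  // x / 4  -> 13
    // Caveat: for negative signed values, >> rounds toward negative
    // infinity while integer division rounds toward zero, so shifts and
    // division only match for unsigned (or known non-negative) values.
}
```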
I only stick with these:
Easy.
For raster graphics image processing, I’d highly recommend G’MIC. Otherwise, Python, especially for string work with its regex library. I wish there was a vector graphics version of G’MIC.
I only do raster graphics image processing, so G’MIC it is. An entire coding language that is a library in and of itself for that.
For non-DSL work, I don’t have a favorite. I’ll choose one of these: Python, C++, C#.
Every language has its own pitfalls. The answer to picking a language is to pick whatever works for you. There may even be domain-specific languages for a domain you’re interested in, and they can be far more flexible than general-purpose solutions for that domain, too.
I use 4 languages.
Paint.NET. Kinda similar in purpose to what I do with G’MIC, except so much more limited. Now, I wish there was a vector equivalent to G’MIC, but there isn’t.
Scala does look nice. Just a quick look at the syntax makes me want to give it a whirl when I want an alternative to Python. I used to code in C++ and C#. I use G’MIC (a DSL) as my main language. Scala seems right up my alley.