VOGONS


What modern activity did you get up to today?

Topic actions

Reply 1560 of 1580, by lti

User metadata
Rank Member

I ended up not installing Linux on my laptop last week. I'm second-guessing my distro choices. Maybe I should run something Ubuntu-based (Kubuntu or give Mint a second chance - I don't know why it was so unstable on that one computer) instead.

Also, I learned that virt-manager passes your clipboard history to the guest OS.

Reply 1561 of 1580, by GigAHerZ

User metadata
Rank Oldbie

@lti, as a life-long Windows user who has only deployed software to Linux servers, but has kept himself away from devops work, I found Fedora KDE surprisingly fluent for me.

Just 2 cents or additional single datapoint for making your decision process more complicated. 😁

"640K ought to be enough for anybody." - And i intend to get every last bit out of it even after loading every damn driver!
A little about software engineering: https://byteaether.github.io/

Reply 1562 of 1580, by lti

User metadata
Rank Member

I was planning on Fedora KDE, but Copr doesn't seem to have as much in it as Ubuntu PPAs. That probably isn't a problem for me, but it was enough to make me hesitate. On EndeavourOS (based on Arch), I've only used the AUR for AviSynth+ and plugins, which was a miserable failure (an extreme case of dependency hell). On Ubuntu, I was using PPAs for newer versions of packages than the official repository, which won't be a problem for Fedora.

For AviSynth+, the easy way out would be to try running it in Wine or switch the big Ryzen to Windows. That CPU is a waste as a web browsing machine anyway, but it's also currently located in a room next to the non-climate-controlled garage. I don't like how Windows has background processes that make the CPU hit its power limit (88W, but with the I/O die and motherboard peripherals added on, total system power consumption hits 130W) while Task Manager shows no significant CPU usage (even in the individual core chart).

Reply 1563 of 1580, by 386SX

User metadata
Rank l33t

Tried the PWM adapter cable to connect the Arc A310's GPU fan to the mainboard's fan system. The trick works, of course, but even the sound of a fixed RPM speed is worse than the variable sound of the video card's original bad fan management. At 30-40% speed (which already generates a lot of noise), the GPU temperature climbs fast under a GPU benchmark, passing 70°C in a few minutes. The UEFI BIOS doesn't see the PCIe card's sensor, so it can't be used for fan control on my Gigabyte mainboard.
So I went back to the card's own PWM connector and will wait for another useless firmware upgrade to see if things get better.

Reply 1564 of 1580, by UCyborg

User metadata
Rank Oldbie

With recent updates, the petrol station app broke on my phone; I couldn't get anywhere after login. Recording the screen revealed the WebView complaining about an unknown URL scheme. This was visible only momentarily before it closed and I was back on the screen with the login button (which I think is native, not part of a web page). Seeing the app work on another phone, I suspected at first that maybe it didn't like the older version of WebView, or that it was tripping on something related to modding (rooting and such). Updating WebView didn't solve it. Then I got an idea and tried disabling Google Chrome on the other phone. That broke the app there too.

So the issue is actually connected to me not having any "normal" web browser on the phone. I tend to use Via, which is built on WebView. And Chrome is disabled.

I struck up a conversation with ChatGPT. Without telling it at first which app I was dealing with, just asking whether it could be fixed by some simple modification to AndroidManifest.xml, it said this is typical of apps using the OAuth login method (and that was without me mentioning that the word "oauth" actually appears in the problematic URL). The server sends back a URL with the app's own custom scheme rather than HTTP / HTTPS. Chrome can handle that and call back the app that registered the scheme, but the WebView fallback fails, since WebView can only handle HTTP / HTTPS; it would only work if the app properly overrode the shouldOverrideUrlLoading() method to bypass the WebView in this instance.
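For illustration, the dispatch logic described there boils down to a simple scheme check. Here is a sketch in Python (the actual fix lives in the app's Java/Smali, and the `myapp` scheme is a made-up placeholder, not the real app's scheme):

```python
from urllib.parse import urlparse

def should_override_url_loading(url: str) -> bool:
    """Mimics the role of WebViewClient.shouldOverrideUrlLoading():
    return True to intercept the URL (hand it to the OS / app via an
    Intent) instead of letting the WebView try to load it."""
    scheme = urlparse(url).scheme.lower()
    # WebView can only render http/https; anything else (e.g. the
    # app's own OAuth callback scheme) must be intercepted, or the
    # WebView fails with "unknown URL scheme".
    return scheme not in ("http", "https")

print(should_override_url_loading("https://example.com/login"))      # False
print(should_override_url_loading("myapp://oauth/callback?code=1"))  # True
```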

Long story short, besides suggesting the obvious easiest solution of installing a "normal" web browser, it offered to guide me towards fixing the application. I agreed, and since I had already told it I had the app decompiled with apktool, its first suggestion was to search for one of 3 keywords, one being that well-known shouldOverrideUrlLoading() method, which I picked first. I showed it 3 instances of that method; the first two were easy and short, so it gave back rewritten methods that check for the app's URL scheme and start the activity if it's not an HTTP / HTTPS URL.

I tried each one and the app's behavior didn't change. The 3rd method I found was more complicated: 688 lines including blanks when copied to Notepad. The AI figured this must be the real location where things go wrong, seeing checks for HTTP and HTTPS. It gave a code snippet with instructions on exactly where to insert it (and to be careful not to disrupt the surrounding code). I tried it, rebuilt the app, signed it, installed it on the phone, and to my surprise, login worked this time!

The next day, I could successfully pay for petrol at the petrol station with the patched app. Mind == blown!

I did not expect it to find a solution to such an obscure problem. And Smali is a particularly alien language (to me), being an intermediate representation that sits below Java.

Arthur Schopenhauer wrote:

A man can be himself only so long as he is alone; and if he does not love solitude, he will not love freedom; for it is only when he is alone that he is really free.

Reply 1565 of 1580, by Ahrle

User metadata
Rank Newbie

AI! Lol. Trying to learn Access 2010 to build a database of the collection...

Also used a Win10 laptop to hunt down setup disks for a recently purchased DeskPro.

Current main: Inspiron 8100, Tualatin 1133, 512MB, GF2 Go, 1600x1200, dualboot 98/XP.

Reply 1566 of 1580, by UCyborg

User metadata
Rank Oldbie
Ahrle wrote on 2026-03-17, 03:00:

AI! Lol.

I don't know, the AI just got it right. Maybe hacking random smartphone apps so they don't need Chrome / Firefox is common knowledge and I'm the stupid one who's behind.

Ahrle wrote on 2026-03-17, 03:00:

Trying to learn Access 2010 to build a database of the collection...

Access would have been my first idea if you'd asked a couple of years ago.

What about SQLite for this sort of thing? Just because it's such a universal, easy, quick-to-get-going database. Though I faintly remember Access having its own bells and whistles, like forms with dropdowns and such. When making a database with something like SQLiteStudio, if you have a table with a foreign key, you have to enter the id yourself to refer to the record in the other table.
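For example, here's a minimal sketch of a collection database with a foreign key, using Python's built-in sqlite3 module (the table and column names are made up for illustration):

```python
import sqlite3

# In-memory database for demonstration; pass a filename to persist.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # FK enforcement is off by default in SQLite

con.executescript("""
CREATE TABLE manufacturer (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE card (
    id              INTEGER PRIMARY KEY,
    model           TEXT NOT NULL,
    manufacturer_id INTEGER REFERENCES manufacturer(id)
);
""")

con.execute("INSERT INTO manufacturer (id, name) VALUES (1, 'ATI')")
# This is the manual step mentioned above: without Access-style forms,
# you refer to the other table by entering its id yourself.
con.execute("INSERT INTO card (model, manufacturer_id) VALUES ('Radeon 9700', 1)")

row = con.execute("""
    SELECT card.model, manufacturer.name
    FROM card JOIN manufacturer ON card.manufacturer_id = manufacturer.id
""").fetchone()
print(row)  # ('Radeon 9700', 'ATI')
```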
---
Feeling weird, mouth flooded with saliva, was at dental surgeon, rotten wisdom tooth...ugh.

Next time, I think I'd just rather go to the morgue.

Arthur Schopenhauer wrote:

A man can be himself only so long as he is alone; and if he does not love solitude, he will not love freedom; for it is only when he is alone that he is really free.

Reply 1567 of 1580, by GigAHerZ

User metadata
Rank Oldbie

+1 on SQLite - beautiful piece of technology.

To be honest, it should be the default choice in general; whenever you go with any "classic" client-server SQL solution, you should have specific requirements demanding that approach.
Okay, rant over. 😁

"640K ought to be enough for anybody." - And i intend to get every last bit out of it even after loading every damn driver!
A little about software engineering: https://byteaether.github.io/

Reply 1568 of 1580, by Ahrle

User metadata
Rank Newbie
UCyborg wrote on 2026-03-17, 10:28:

What about SQLite for this sort of thing? Just because it's such a universal, easy, quick-to-get-going database. Though I faintly remember Access having its own bells and whistles, like forms with dropdowns and such. When making a database with something like SQLiteStudio, if you have a table with a foreign key, you have to enter the id yourself to refer to the record in the other table.

Is SQLite better? I only know Access, or well, the name, after it sat there unused for a dozen years 😅

Current main: Inspiron 8100, Tualatin 1133, 512MB, GF2 Go, 1600x1200, dualboot 98/XP.

Reply 1569 of 1580, by UCyborg

User metadata
Rank Oldbie

It was more of a general question for the audience, as I'm not sure about "better", just "different".

Arthur Schopenhauer wrote:

A man can be himself only so long as he is alone; and if he does not love solitude, he will not love freedom; for it is only when he is alone that he is really free.

Reply 1570 of 1580, by 386SX

User metadata
Rank l33t

Today I'm testing old Windows games and benchmarks on the Raspberry Pi 5 SBC (quad-core Cortex-A76 ARM @ 2.4 GHz) using Box64 and Wine. Besides having to spend more time configuring resolutions and virtual desktops, it really works faster than expected. Even 3DMark05 can run faster than a Radeon 9700 probably did in its time, which is remarkable considering the translation layers involved: a different OS, a different CPU architecture, and D3D to OpenGL.

Reply 1571 of 1580, by 386SX

User metadata
Rank l33t

Thief 2 running on the Raspberry Pi is so great. I patched the original game with T2Fix for faster DirectX 9 rendering.

Reply 1572 of 1580, by newtmonkey

User metadata
Rank Oldbie

After many years I finally got a new gaming laptop. I understand that it's a waste of money to do this, because it would be much better to spend that money on a much nicer gaming desktop, but I just like laptops. I also use it for work, so it's nice to be able to just carry around the same system if I'm traveling for work, or whatever.

Anyway, it's a massive upgrade over my previous system, which was top-of-the-line back when I got it many years ago and surprisingly able to keep up over the years, but really started to struggle to maintain a decent fps even at 1080p with modern games.

I also got a surprisingly cheap Dell FreeSync monitor, and it's amazing. I initially was not expecting much; after all, if the new PC can maintain 60 fps, would I really need it? Well, it's just awesome. Playing older FPS games at 100-200 fps is amazing, and playing games on DOSBOX at 70Hz really helps make it feel authentic; I actually find myself turning to DOSBOX more often than my dedicated DOS machine. It's also nice to not have to worry about vsync at all for most games, with no tearing and no input lag as long as the game can maintain 48 fps or so. It's a game changer, even if the game somehow doesn't work with freesync for some reason. It's completely transformed my PC gaming experience.

Reply 1573 of 1580, by 386SX

User metadata
Rank l33t

An update on the Raspberry Pi 5 running old Windows benchmarks and games through the x86>ARM64 translator, accelerated by its V3D 7.1.7 GPU (using D3D to GL to EGL translation). Early results:

3DMARK2001 @ default

RESULTS
3DMark Score 9613
Game 1 - Car Chase - Low Detail 99.8 fps
Game 1 - Car Chase - High Detail 41.6 fps
Game 2 - Dragothic - Low Detail 165.2 fps
Game 2 - Dragothic - High Detail 94.6 fps
Game 3 - Lobby - Low Detail 110.4 fps
Game 3 - Lobby - High Detail 55.5 fps
Game 4 - Nature 101.2 fps
Fill Rate (Single-Texturing) 1775.7 MTexels/s
Fill Rate (Multi-Texturing) 2905.8 MTexels/s
High Polygon Count (1 Light) 98.3 MTriangles/s
High Polygon Count (8 Lights) 81.1 MTriangles/s

3DMark05 @ default instead arrives at a good 3300 score. There's room for serious improvement, considering I'm using OpenGL instead of the Vulkan driver, which seems to work faster but breaks some geometry and textures (at least in Thief 2). I'll try to take screenshots, but right now I'm not using the virtual desktop option in Wine (as I did above), so they don't seem to work (this way rendering is faster and smoother, with less latency, more like a real x86 environment).

Reply 1574 of 1580, by Joseph_Joestar

User metadata
Rank l33t++

I've been testing various DLSS 4.0 modes in Clair Obscur: Expedition 33, to see what works best on a 1080p screen, while fully maxing out all other visual settings (Epic preset). Surprisingly, I found that running the game at 1440p with DLSS Performance provides better temporal stability than 1080p with DLSS Quality. In both cases, the internal render resolution should be 720p, but somehow, it still looks better in the first scenario. Also, GPU utilization is around 15% higher in that case.

The attachment E33_1080p_DLSS_Quality.jpg is no longer available
The attachment E33_1440p_DLSS_Performance.jpg is no longer available

During the intro cutscene, notice the black artifacts in the hair strands on the right side of Maelle's face at 1080p + DLSS Quality. It looks even worse in motion, as those artifacts keep flickering and moving around. Yet they are completely absent at 1440p + DLSS Performance. I even tested 1080p + DLAA and that still had those artifacts. I have read that DLSS can produce suboptimal results when used at 1080p, but this is the first game where I've seen such a drastic difference. In Expedition 33, several party members have long, flowing hair, so this is very noticeable.

My retro builds

Reply 1575 of 1580, by BetaC

User metadata
Rank Oldbie
Joseph_Joestar wrote on 2026-03-24, 13:32:

I've been testing various DLSS 4.0 modes in Clair Obscur: Expedition 33, to see what works best on a 1080p screen, while fully maxing out all other visual settings (Epic preset). Surprisingly, I found that running the game at 1440p with DLSS Performance provides better temporal stability than 1080p with DLSS Quality. In both cases, the internal render resolution should be 720p, but somehow, it still looks better in the first scenario. Also, GPU utilization is around 15% higher in that case.

The attachment E33_1080p_DLSS_Quality.jpg is no longer available
The attachment E33_1440p_DLSS_Performance.jpg is no longer available

During the intro cutscene, notice the black artifacts in the hair strands on the right side of Maelle's face at 1080p + DLSS Quality. It looks even worse in motion, as those artifacts keep flickering and moving around. Yet they are completely absent at 1440p + DLSS Performance. I even tested 1080p + DLAA and that still had those artifacts. I have read that DLSS can produce suboptimal results when used at 1080p, but this is the first game where I've seen such a drastic difference. In Expedition 33, several party members have long, flowing hair, so this is very noticeable.

Given that 1440 is a multiple of 720, it makes sense. It also makes sense that the secret sauce, filtering the hell out of the upscaling to hide that it's upscaling, benefits from a simple integer multiple.
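A quick sanity check of the scale factors (assuming the internal render resolution really is 720p in both cases, as the post above says):

```python
internal = 720  # assumed internal render height in pixels
for output in (1080, 1440):
    scale = output / internal
    kind = "integer" if scale.is_integer() else "fractional"
    print(f"{output}p / {internal}p = {scale}x", kind)
# 1080p / 720p = 1.5x fractional
# 1440p / 720p = 2.0x integer
```

With a fractional 1.5x scale, each source pixel maps unevenly onto output pixels, which plausibly makes temporal artifacts harder to hide than a clean 2x mapping.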


Reply 1576 of 1580, by StriderTR

User metadata
Rank Oldbie

I was able to get a Bitaxe 601 exceptionally cheap (under $20) to play around with. Got it flashed, set up, and running at about 1.2 TH/s.

Thinking of trying to OC it. I've got most of what I need on hand. Just to see how fast I can make it go, because, why not. I should find a block in about 15,000 years, give or take a few centuries.

Yes. This is just for the giggles! 😀

DOS, Win9x, General "Retro" Enthusiast. Professional Tinkerer. Technology Hobbyist. Expert at Nothing! Build, Create, Repair, Repeat!
This Old Man's Builds, Projects, and Other Retro Goodness: https://theclassicgeek.blogspot.com/

Reply 1577 of 1580, by 386SX

User metadata
Rank l33t

Today I'm testing my Intel Arc A310 ECO with the new Ubuntu 26.04 (beta). There's still no hope for fan curve management of the GPU, but I'm impressed that even in Ubuntu, Chromium-based browsers can now finally use the card's AV1 hardware video decoder on YouTube. The CPU stays almost free (5-10%) even with 60fps videos at the highest resolution, while the GPU's video engine sits at up to 10% usage. Before, with software decoding, the video engine was always at 0%, of course.
If it weren't for the awful fan problems, this would be such a nice video card for office work and light gaming.

Reply 1578 of 1580, by darry

User metadata
Rank l33t++

Bought 16GB of DDR4 3200 for CAN$169.99 before taxes, for a friend's no-longer-delayable PC upgrade. Hint: the old CPU is a Xeon E5450 on a P35 board.

Reply 1579 of 1580, by Nexxen

User metadata
Rank l33t
darry wrote on Today, 01:39:

Bought 16GB of DDR4 3200 for CAN$169.99 before taxes, for a friend's no-longer-delayable PC upgrade. Hint: the old CPU is a Xeon E5450 on a P35 board.

Not so long ago it would have been 69.99.
Well, P35... it was really about time 😀

PC#1 Pentium 233 MMX - 98SE
PC#2 PIII-1Ghz - 98SE/W2K

- "One hates the specialty unobtainium parts, the other laughs in greed listing them under a ridiculous price" - kotel studios
- Bare metal ist krieg.