VOGONS


Reply 20 of 54, by ruthan

Rank: Oldbie
Falcosoft wrote:

Problem with CheckIt: as I said, in some cases CheckIt can report the video BIOS + option ROM together as one continuous occupied area labelled 'Unknown ROM'.
In this case the real video BIOS size is actually less than 64 KB, but the AHCI option ROM is enabled. Someone could misread this and falsely conclude that the video BIOS is 88 KB.

OK, thanks for the explanation. I have already simulated this problem and fixed some of the reported sizes in the first post; I will slowly check the others.
BTW, do you think it is possible to read, from the firmware with a datasheet's help, the real DOS GPU / VRAM clocks of modern cards that have more than one power / clock state? And what about VRAM size? Rayer's MTRR tool reports some modern cards as 4 MB during execution, others as 14 or 16 MB. What causes this difference? Is it a ROM design thing too?

Im old goal oriented goatman, i care about facts and freedom, not about egos+prejudices. Hoarding=sickness. If you want respect, gain it by your behavior. I hate stupid SW limits, SW=virtual world, everything should be possible if you have enough raw HW.

Reply 21 of 54, by Falcosoft

Rank: Oldbie
ruthan wrote:

BTW, do you think it is possible to read, from the firmware with a datasheet's help, the real DOS GPU / VRAM clocks of modern cards that have more than one power / clock state? And what about VRAM size? Rayer's MTRR tool reports some modern cards as 4 MB during execution, others as 14 or 16 MB. What causes this difference? Is it a ROM design thing too?

On Windows, some BIOS editors can show you the DOS GPU/memory clocks. The DOS clocks can usually be found as the Boot/Bootup clocks.

[Attachment: DosClocks.jpg, 133.77 KiB]

Under DOS you can use the standard VESA function 0 'VBE Controller Information' to get the 'TotalMemory' member of the 'VbeInfoBlock' structure. But on modern cards this function usually reports only 16 MB as available for VESA modes.
The last cards that reported their real memory size for VESA were the 256 MB cards of the GeForce 6/7 era.
I think Rayer's program uses this VESA function to get the memory size, too.
Yes, it's a similar 'ROM design thing' for compatibility reasons.
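For reference, the size calculation behind that VESA call can be sketched in C. The DOS interrupt itself (INT 10h with AX=4F00h, which fills a VbeInfoBlock buffer) is left out here; the function below only converts the TotalMemory field, which VBE defines in 64 KB blocks, into kilobytes.

```c
#include <stdint.h>

/* VBE function 00h ('VBE Controller Information', INT 10h / AX=4F00h)
   fills a VbeInfoBlock whose TotalMemory member counts the memory
   available to VESA in 64 KB blocks.  The interrupt call itself is
   omitted; this only shows the unit conversion. */
unsigned long vbe_total_kb(uint16_t total_memory_blocks)
{
    return (unsigned long)total_memory_blocks * 64UL; /* blocks -> KB */
}
```

A card whose BIOS caps TotalMemory at 256 blocks thus reports 16 MB, matching the behavior described above.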

Website, Facebook, Youtube
Falcosoft Soundfont Midi Player + Munt VSTi + BassMidi VSTi
VST Midi Driver Midi Mapper

Reply 22 of 54, by Baoran

Rank: l33t

Cardex Tseng ET4000AX ISA: 32 KB reported by NSSI
S3 Vision 864 PCI: 32 KB reported by NSSI

Then there is this card:

[Attachment: tsenget3000ax.jpg, 1.92 MiB]

I don't know the brand, but it has a Tseng Labs ET3000AX chip. It has the smallest BIOS I have seen so far, at 24 KB. I checked it with both DEBUG and vbsize to make sure.

Edit: I will leave this post here because there are probably other cards with the same ET3000AX chip that might have a different BIOS.
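For reference, this is essentially what DEBUG or vbsize reads: an ISA/PCI option ROM begins with the signature bytes 55h AAh, and the third header byte gives the ROM length in 512-byte blocks. A minimal sketch of that check:

```c
#include <stdint.h>

/* Return the option ROM size in bytes from its first three header
   bytes (as mapped at C000:0000 for a video BIOS), or 0 if the
   55h AAh signature is missing. */
unsigned long option_rom_size(const uint8_t header[3])
{
    if (header[0] != 0x55 || header[1] != 0xAA)
        return 0;                            /* not a valid option ROM */
    return (unsigned long)header[2] * 512UL; /* length in 512-byte blocks */
}
```

A 24 KB BIOS like the ET3000AX one above would carry 48 (30h) in that size byte.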

Reply 23 of 54, by ruthan

Rank: Oldbie

Pascal / C thing..

FalcoSoft wrote:

I don't know where you got the information that Pascal does not have pointers. It has always had full pointer support, even in the Turbo Pascal days:
https://www.tutorialspoint.com/pascal/pascal_pointers.htm

I / we never used them in Pascal; in C they seemed essential from the start and were used by lots of built-in stuff.

FalcoSoft wrote:

I don't think you have understood my original sentence. It was: "With hand-made optimized assembly routines at key parts, a Pascal program's execution speed can be much faster than pure C code."
Of course you can use assembly routines in C too. No one said that the 'ASM part would be faster in Pascal'. I talked about pure C code. So my argument was that you got the biggest performance gains when you used hand-made optimized assembly routines at key parts of the program, not when you chose C instead of Pascal. You could not avoid assembly if you wanted the best performance, regardless of using Pascal or C.
And Turbo Pascal had no disadvantage in this respect.

I think I understood, but it still seems to be apples to oranges.. writing lots of code in ASM wouldn't be effective from a development-time perspective, unless you were really very good with ASM.

FalcoSoft wrote:

And of course there is Free Pascal...

How much is it still supported and alive in comparison to the C stuff? Is it still possible to compile programs for DOS with it, but develop them on modern Windows, to enjoy the power of multiple monitors / internet / virtual target machines, etc.?

FalcoSoft wrote:

Delphi is still alive (even with 64-bit support) and arguably is still a viable choice for native Win32/Win64 desktop programming.
Delphi's problem is not performance but bad business choices (there is no free/community version, and each version could cost you a fortune twice a year) and real multi platform support. IOS/OSX/Android dialect of Object Pascal is different from Win32/64 dialect and you cannot use the known Win32/64 visual controls (VCL) in mobile development. So the 'write once, compile everywhere' motto of Delphi is not true in reality.

Who develops / supports it now? I heard maybe 10 years ago that Borland was bought by some other company.
Can you use modern stuff like DirectX/Direct3D 9/10/11, Vulkan and so on with it? Or is it simply old stuff that happens to run on modern OSes?
Borland made lots of strange business decisions; from the time Visual Studio started to have free versions, they had to know that they would have to do the same, or they would lose the market..

FalcoSoft wrote:

I have written many performance critical programs/libraries in different Pascal developing environments (of course also using some assembly optimizations at key parts) and all of them have been at least as fast as implementations written in C/C++ environments.

Let's say that you are right about speed.. Why do you think the majority of developers, even back in the day, moved to C instead of Pascal?

FalcoSoft wrote:

Sorry for not reflecting the other 'modern programming' parts of your post but it's out of context for me.
Personally I have never written (and never will) AAA+ games so I have no experience in this area.
But I have written many performance critical programs/libraries in different Pascal developing environments (of course also using some assembly optimizations at key parts) and all of them have been at least as fast as implementations written in C/C++ environments.
Nowadays at work I'm programming exclusively in environments that use some kind of C dialect languages but I miss Pascal and low-level assembly every day 😀
Also (partly) I'm visiting a retro forum like this because I'm a little bit fed up with modern programming trends. So I'm not the best choice to discuss Java, web development, .Net stuff anyway.

Games development is interesting because it was always performance heavy; from the old times there were lots of tricks involved to make games happen. It gained importance again with mobile gaming, where you once again have quite limited resources (but sadly often limited or no low-level access to the engine / OS / HW). People can dream about magic optimizations all day, but for example the Nintendo Switch is too slow for some of today's PS4 / Xbox games, even when they are ported at smaller resolutions and with much lower-polygon models. Developers had a very hard time making the old BioShock iPad port, and even then it worked for maybe a year or two, after which they said: sorry, we will not support newer iOS versions.
I don't have many illusions even about development for science, where performance should matter a lot; I have had the chance to see lots of bad stuff there. I'm sure there are some exceptions, but on average I don't think the level of performance optimization is the same as or better than in games, where you simply see performance and developer skill translated into graphics fidelity, and it's hard to cheat it.. but it has a negative aspect too: games are now more and more expensive, and more and more about graphics and less and less about anything else.

Today's non-games development (I'm part of it too, at least part-time): we have lots of raw power to burn, and it's more about how to save development time than about anything else, except maybe some "sexy" GUI (which is usually designed by the customer / other companies, etc..); there are lots of components, APIs and layers where you lose performance. I saw people doing a simple calculation through WebService calls; you can kill the whole performance just by bad settings on the WS/app server. There are DB jobs / procedures which run 100x slower and consume far more resources than they could, and nobody cares. I saw very expensive machines running DBs with default settings suited for a desktop, etc.
It's more about design than coding, and after it is implemented you usually fix only some very critical performance problems, which is far from the old low-level optimizations.

I would say that one level up in optimization would probably be game machine emulator development; it sometimes looks very low level.

Last edited by ruthan on 2018-11-04, 10:36. Edited 1 time in total.


Reply 24 of 54, by ruthan

Rank: Oldbie
Falcosoft wrote:

On Windows, some BIOS editors can show you the DOS GPU/memory clocks. The DOS clocks can usually be found as the Boot/Bootup clocks.

This Windows stuff is nice too, but I wanted to know whether such info is available even in DOS, at least in theory, or whether it is inaccessible from there.


Reply 25 of 54, by Falcosoft

Rank: Oldbie
ruthan wrote:

This Windows stuff is nice too, but I wanted to know whether such info is available even in DOS, at least in theory, or whether it is inaccessible from there.

I don't think such information is available in DOS from the Bios ROM area at 0xC0000. This kind of info would occupy valuable space without any advantage since it's never needed in DOS. Of course in theory a DOS executable can be written that could extract this information e.g. from full firmware dump files (similarly to Windows based Bios editor utilities) but I do not find much sense in doing this.

I / we never used them in Pascal; in C they seemed essential from the start and were used by lots of built-in stuff.

Then your Pascal training certainly was not too deep 😀 Pointers are first-class citizens in Pascal, similarly to C/C++, and e.g. all dynamic memory allocation and dynamic variables use pointers:
C -> Pascal
malloc() -> GetMem()
free() -> FreeMem()
C++ -> Pascal
new() -> New()
delete() -> Dispose()
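As an illustration of how direct the mapping is, here is the C side of the allocation pattern, with the rough Pascal counterpart noted in the comments (a sketch; the Pascal names follow the table above):

```c
#include <stdlib.h>

/* Allocate a buffer, fill it, sum it, free it.
   The Pascal counterpart of each allocation call is noted per line. */
int sum_squares(int n)
{
    /* malloc() <-> GetMem(p, n * SizeOf(Integer)) */
    int *a = malloc((size_t)n * sizeof *a);
    if (!a)
        return -1;

    int sum = 0;
    for (int i = 0; i < n; i++) {
        a[i] = i * i;   /* a^[i] := i * i in Pascal pointer syntax */
        sum += a[i];
    }

    /* free() <-> FreeMem(p) */
    free(a);
    return sum;
}
```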

I think I understood, but it still seems to be apples to oranges.. writing lots of code in ASM wouldn't be effective from a development-time perspective, unless you were really very good with ASM.

I do not think so, but maybe it's my mistake, so I'll try a more detailed explanation. For me at least it's much easier and faster to build the frame/base/event-handling/GUI parts of a project in Object Pascal than in C/C++ (not to mention the faster debug/modify/recompile cycles). For most desktop programs it's true that the performance-critical parts are only a smaller part of the whole program (say, 10%). The other parts are mostly user-interaction handling for setting parameters etc., where even a Basic interpreter's code would be fast enough. Overall development time for a whole program is not more but less, since you can write the frame/base/GUI parts (that represent 90% of the code) faster, and so you have more time left to optimize the speed-critical parts, even with hand-made assembly code. But this approach also has a disadvantage: namely, assembly is inherently platform dependent. So for multi-platform projects it doesn't work, and a fast C/C++ compiler is a better choice.
So for multi-platform/mobile development I would never choose Delphi. But for Win32/64 development even for new projects I still choose it.

How much is it still supported and alive in comparison to the C stuff? Is it still possible to compile programs for DOS with it, but develop them on modern Windows, to enjoy the power of multiple monitors / internet / virtual target machines, etc.?

Free Pascal (and the Lazarus IDE) is still in active development, and not only for 'legacy' platforms:
http://wiki.freepascal.org/Platform_list
https://www.lazarus-ide.org/index.php?page=downloads

It supports anything that is available on modern Windows (including multi-monitor, internet, etc.).
The question is why not? More precisely, why do you think it cannot? It seems you miss the point of how native Windows compilers work. Windows is a bunch of libraries. These libraries (core functions in kernel32.dll, user32.dll, gdi32.dll, comctl32.dll, comdlg32.dll, ole32.dll etc.) export the functions a program can use. If a new feature is added to Windows, it means either a new library or a new function in an existing library. Even an older compiler that was written long before these new Windows features were added can use them; only a new import unit is needed for the new functions.
There is no new MS-supplied middleware/CLR runtime needed; you just need to know the interface (parameters etc.) of the new functions from the MS documentation and write a proper import for them (of course, newer versions of the FreePascal/Delphi compilers include the import files and components necessary for new features built-in).
The compiled executable is a real x86/x64 binary that can talk directly to Windows libraries without an intermediate runtime.
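The 'import unit' idea has a close C analogue: declare the exported function's signature yourself, straight from the documentation, and let the linker bind it, just as a Delphi/FreePascal import unit binds a Pascal declaration to a function exported by kernel32.dll or user32.dll. In this sketch libc's toupper stands in for a Windows export:

```c
/* Hand-written 'import': no <ctype.h> include.  We declare the
   function exactly as the documentation specifies, and the linker
   binds it to the C library, the same way an import unit binds a
   Pascal declaration to a function exported by a Windows DLL. */
extern int toupper(int c);

int shout(int c)
{
    return toupper(c);
}
```

The same mechanism is why an old compiler can call an API that did not exist when the compiler shipped: only the declaration is new.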
An example:
Falcosoft Soundfont Midi Player + Munt VSTi + BassMidi VSTi
http://falcosoft.hu/softwares.html#midiplayer
The 32-bit version of my most famous program here on Vogons is still maintained in Delphi5 (released in 1999). It can run on anything from Win9x to Windows 10 (without any other dependency that you would have to download separately) and it can also use advanced features of later Windows versions (XP themes, Win 7+ interactive Taskbar thumbnails, WASAPI output, etc.) despite the fact that when the compiler was released even Win XP did not exist. Also it uses Directsound, VST technology and many other things that stock Delphi never supported (and never will). It's possible since you can use any features of Windows or other libraries if you download or make yourself a proper port of an SDK or import unit (and write a class/component around it).
Also the Munt VSTi x86/x64 plugin that interfaces with the Munt C library is written in Delphi. I have not found a working Pascal VST 2.4 SDK port (not to mention 64-bit) so I had to write it myself but it was a pleasure 😀. It's included in MidiPlayer's package but also can be downloaded separately:
http://falcosoft.hu/softwares.html#munt_vsti

Who develops / supports it now? I heard maybe 10 years ago that Borland was bought by some other company.
Can you use modern stuff like DirectX/Direct3D 9/10/11, Vulkan and so on with it? Or is it simply old stuff that happens to run on modern OSes?

It's called Embarcadero:
https://www.embarcadero.com/
Of course you can use modern features. See my answer above: you do not even need the latest and greatest versions for it:
For Directx 9:
https://sourceforge.net/projects/delphi-dx9sdk/
For Directx 10-11:
http://www.jsbmedical.co.uk/DirectXForDelphi/
For Directx 12:
https://github.com/CMCHTPC/DelphiDX12
For Vulkan:
http://git.ccs-baumann.de/bitspace/Vulkan

Anyway it's good you asked since I have noticed there is a new free/community version released:
https://www.embarcadero.com/products/delphi/starter

Let's say that you are right about speed.. Why do you think the majority of developers, even back in the day, moved to C instead of Pascal?

Don't just believe it because I said so. Try it and see for yourself:
http://falcosoft.hu/index.html#mandelx
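MandelX is a Mandelbrot benchmark, and the hot loop such benchmarks compare across compilers is essentially the classic escape-time iteration below (a generic sketch, not MandelX's actual source):

```c
/* Classic Mandelbrot escape-time loop: iterate z = z^2 + c and count
   iterations until |z| exceeds 2 (i.e. |z|^2 > 4), up to max_iter.
   This tiny, branch-light floating-point loop runs millions of times
   per image, which is why Mandelbrot renderers make handy compiler
   benchmarks. */
int mandel_iters(double cr, double ci, int max_iter)
{
    double zr = 0.0, zi = 0.0;
    int i;
    for (i = 0; i < max_iter; i++) {
        if (zr * zr + zi * zi > 4.0)
            break;                            /* escaped the set */
        double t = zr * zr - zi * zi + cr;    /* Re(z^2 + c) */
        zi = 2.0 * zr * zi + ci;              /* Im(z^2 + c) */
        zr = t;
    }
    return i;
}
```

Points inside the set burn the full iteration budget, so the compiler's code quality for this loop dominates the render time.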

I think it's mainly because if you do not use the developer tools of the platform owner you are a 2nd class citizen. You have to putter with many things and make many things yourself. But once it's done it's yours (even for later projects) and it works the way you like it. If you use the platform owner tools (Windows ->MS Visual Studio, Android -> Google Android studio, IOS, OSX -> Apple Xcode ) you get everything ready-made. But being a 2nd class citizen has advantages too: When you have to write your own stuff instead of getting it ready you will have a deeper understanding of how the used API works and overall how your whole project works.
So for my hobby projects I never feel it as a disadvantage. In my free time I enjoy tinkering/optimizing or writing my own ports from C code or import units/components from scratch. There's no rush to market 😀.
And contrary to my average company projects (that nowadays are rather Lego-ing/Gluing and more than 50% of the project's code is not written by the project's programmers but downloaded by Nuget, Npm etc.) I really know what part does what and when a bug is reported I have a much better chance to fix it myself without having to wait for someone else since it's my own code.

@Edit: Also, here are relatively new comparisons (2015, 2017) of Delphi, C/C++ and C# integer speed (do not trust them fully; results can change depending on the task). But the point is that there is no "slow" contender anymore. Today's compilers, even C#/Java, produce fast code.
https://codingforspeed.com/integer-performanc … for-c-c-delphi/
https://helloacm.com/integer-computation-effi … elphi-java-c-c/

And a little bit older (2011) but more thorough university research result:
https://www.ijcaonline.org/volume26/number1/pxc3874199.pdf

More Google results:
https://www.google.com/search?q=delphi+vs+C%2B%2B+benchmark


Reply 26 of 54, by ruthan

Rank: Oldbie

We have the first result showing that the card manufacturer actually matters: my Microstrat GeForce 7600 GT reports a VBIOS size of 59 KB, Falcosoft's Inno3D only 55 KB. My card is a PCI-E model with 2x DVI + S-Video; maybe that matters too.


Reply 28 of 54, by ruthan

Rank: Oldbie

+1 for vbsize. A few days ago I discovered that Navratil System Information (NSSI) has video BIOS size info too, and it is always the same as what vbsize shows. But I tried to use it on my X58 machine and it shows a 'too many PCI devices' error at startup; it is probably not designed to work properly on a system with multiple video cards or a primary video card BIOS selection, so it shows the PCI video card's info even when that card is not the primary one. vbsize works fine.


Reply 29 of 54, by ruthan

Rank: Oldbie
Falcosoft wrote:
ruthan wrote:

This Windows stuff is nice too, but I wanted to know whether such info is available even in DOS, at least in theory, or whether it is inaccessible from there.

I don't think such information is available in DOS from the Bios ROM area at 0xC0000. This kind of info would occupy valuable space without any advantage since it's never needed in DOS. Of course in theory a DOS executable can be written that could extract this information e.g. from full firmware dump files (similarly to Windows based Bios editor utilities) but I do not find much sense in doing this.

Thanks, I was just wondering whether the lack of information in DOS HW info utils is only due to the platform's age, or whether there are additional limits.

Falcosoft wrote:

Then your Pascal training certainly was not too deep 😀

No doubt about that: it was something like 20 lessons, each an hour and a half long, where we had to wait for some slower classmates; there was also fast internet for free. Other knowledge I had from my uncle, but he was a jack of all trades: originally a theater audio engineer, yet he was able to do everything from board design to microchip programming to computer programming; later he moved to C and did some Linux stuff.. After that he had some problems with the law, some debts.. now he has been AWOL for 10 years; he could be dead, or living some successful life in another country under a second identity, you never know with these very clever guys.. Until I started doing some minor computer stuff for the local astronomical observatory in high school, I never had any programming books. I lived with my father, who didn't believe that a child needs much care or stuff beyond clothes (which I often received as presents from distant relatives) and some food; I spent all my money (not much) on keeping my PC (which I had bought with money received from other family members earlier) able to play games, plus video game / fantasy magazines, and when something was left over I bought some fantasy role-playing stuff..

Falcosoft wrote:
ruthan wrote:

I think I understood, but it still seems to be apples to oranges.. writing lots of code in ASM wouldn't be effective from a development-time perspective, unless you were really very good with ASM.

I do not think so, but maybe it's my mistake, so I'll try a more detailed explanation. For me at least it's much easier and faster to build the frame/base/event-handling/GUI parts of a project in Object Pascal than in C/C++ (not to mention the faster debug/modify/recompile cycles). For most desktop programs it's true that the performance-critical parts are only a smaller part of the whole program (say, 10%). The other parts are mostly user-interaction handling for setting parameters etc., where even a Basic interpreter's code would be fast enough. Overall development time for a whole program is not more but less, since you can write the frame/base/GUI parts (that represent 90% of the code) faster, and so you have more time left to optimize the speed-critical parts, even with hand-made assembly code. But this approach also has a disadvantage: namely, assembly is inherently platform dependent. So for multi-platform projects it doesn't work, and a fast C/C++ compiler is a better choice.
So for multi-platform/mobile development I would never choose Delphi. But for Win32/64 development even for new projects I still choose it.

OK, I have to agree that Pascal is more user friendly => faster to develop with, if we ignore some fancy modern stuff like the non-free Visual Studio features such as static code analysis; I don't think Delphi has big support there. That only a fragment of the code matters from a performance perspective is true, unless again you do modern video games, where you have at least several parallel threads and every ms counts..
I was arguing about another thing.. assuming that native Pascal code is slower than C, you would have to use ASM, and use it more often; and to be honest, most programmers don't like that. Probably at school you still have to pass some ASM exam, but the majority of coders end there and live their whole lives without it.
To be honest, the last time I heard about ASM optimization, outside of low-level console stuff, was when Ken Silverman was tuning the Build engine; I dunno if anyone still does low-level stuff on PC, maybe shaders could be tuned at the ASM level..
It reminds me of an advantage of C# (never used it personally): there is something called unsafe mode, where you can use the speed and power of C++; I never heard that Java, for example, has something like that. BTW, multi-platform development: for most people today it is simply shielded by some engine / framework, where you still have to care from time to time that something is not working on some platform and fix it, and keep the slowest platform in mind, but that's all; you often don't even have the possibility to tune low-level platform stuff.
To be honest, if we ignore web and scripting stuff, the only multi-platform choice that works with the least porting hassle is now Java (slow as hell); even with C there is lots of hassle because it is much more low level. C#, unless something has changed on MS's side in recent years, has only the Mono port for multi-platform, and that is not optimal either.

To be continued: it's too long for me to address everything in one post. Please wait, and I will address the second part.


Reply 30 of 54, by ruthan

Rank: Oldbie

Pascal / C stuff..

Free Pascal:
I just tried it and compiled a hello world with it for fun, and my Avira antivirus thought it was some virus; I guess that if it were more widely used on the market, they would exclude it in their rules..
By multi-monitor, etc.. I meant only that it works under modern Windows; DOS stuff is not fully supported on modern Windows, and using it with DOSBox or in a virtual machine is worse than native use.

FalcoSoft wrote:

The question is why not? More precisely, why do you think it cannot? It seems you miss the point of how native Windows compilers work. Windows is a bunch of libraries. These libraries (core functions in kernel32.dll, user32.dll, gdi32.dll, comctl32.dll, comdlg32.dll, ole32.dll etc.) export the functions a program can use. If a new feature is added to Windows, it means either a new library or a new function in an existing library. Even an older compiler that was written long before these new Windows features were added can use them; only a new import unit is needed for the new functions.
There is no new MS-supplied middleware/CLR runtime needed; you just need to know the interface (parameters etc.) of the new functions from the MS documentation and write a proper import for them (of course, newer versions of the FreePascal/Delphi compilers include the import files and components necessary for new features built-in).
The compiled executable is a real x86/x64 binary that can talk directly to Windows libraries without an intermediate runtime.

Hmm, I never needed to understand this deeply. I used Visual Studio, or sometimes MonoDevelop for smaller stuff, or some middleware, and didn't need to care how exactly it connected with the rest of Windows.. I have to know which version of the runtime or redistributables users need to execute the final app, but I thought FreePascal would have some problems there; if it does not, great.

FalcoSoft wrote:

I think it's mainly because if you do not use the developer tools of the platform owner you are a 2nd class citizen. You have to putter with many things and make many things yourself. But once it's done it's yours (even for later projects) and it works the way you like it. If you use the platform owner tools (Windows ->MS Visual Studio, Android -> Google Android studio, IOS, OSX -> Apple Xcode ) you get everything ready-made. But being a 2nd class citizen has advantages too: When you have to write your own stuff instead of getting it ready you will have a deeper understanding of how the used API works and overall how your whole project works.
So for my hobby projects I never feel it as a disadvantage. In my free time I enjoy tinkering/optimizing or writing my own ports from C code or import units/components from scratch. There's no rush to market 😀.
And contrary to my average company projects (that nowadays are rather Lego-ing/Gluing and more than 50% of the project's code is not written by the project's programmers but downloaded by Nuget, Npm etc.) I really know what part does what and when a bug is reported I have a much better chance to fix it myself without having to wait for someone else since it's my own code.

I meant it for the past too, I mean the time before Windows was the main platform; performance-heavy things such as OSes and games were still mainly developed in C for some reason. Yeah, now you are much more pushed to use MS stuff on Windows.
I understand your approach, and it makes sense if your scope is reasonable; I mean, I still know guys who are able to keep up with development with their own web frameworks or with some desktop / 2D tool apps. It's nice to have a low-level understanding, and you can do lots of nice stuff in areas where lots of people would get stuck or say that it is not possible.
There are also some big third-party components and plugins which have to be compatible, supported, etc.. and there is no way around them, at least in games.. because otherwise you would be reinventing the wheel.

I know lots of guys who tried to write their own video game engines. At some time, 1995-2002 or thereabouts, it was possible to make competitive stuff in a small team, but after that all except one very, very clever guy failed and embraced some proven technology, or had to go to big companies. That one guy who still keeps up had to move out of games and make specific tools for designers and architects, and has to train his own people to use them and deliver most of the project, or at least heavily support the external users.
I was lucky that I stopped with my own engines early, around Quake, and discovered that for me it's much more fun to design than to code; I'm still able to write some small stuff for myself, mostly data conversion tools. Otherwise I'm just a part-time mercenary for short projects, and most of my projects use some specific APIs / frameworks.. which are not useful for other projects.. and my advantage is that I'm able to adapt and I have a general overview of the different layers.. but coding is usually only a smaller part of my work; otherwise I do some analytic stuff.
In games, if I have time, I'm still able to do gameplay development, which is usually glorified scripting at best, and with the evolution of engines it's less and less about coding and more about system design: most engines now have some flow-diagram visual programming stuff, or support for some slow scripting language (Lua, Python, etc..) for game world object manipulation / settings.. and if you really need some performance-heavy things, they can be moved to C. I was never particularly obsessed with 3D rendering, physics or other stuff which would need more advanced math or science, or with algorithmization / code structure itself, but it's good that some guys are, and it's good to have them on the team :)
That balance between scripting capabilities and the low-level engine could seem easy to solve, but on the previous generation of HW / consoles.. most of the major engines had problems with it, so making big streamable worlds with lots of objects was an almost unsolvable problem; most games had smaller corridors, levels, etc.... now it's much better, but some strange performance hiccups are still generated by that.

Thanks for lots interesting info.

Update: Good benchmarks for language / compiler speeds; I saw a lot of benchmarks with different results and I really don't know which are the best ones.. I tend to prefer something more complex; the problem with Pascal is that most benchmarks ignore it.. It really depends on the use case. For example, during searching I found this, and there is big variability: https://github.com/unixpickle/Benchmarks. Lots of its results miss the big-picture view; for example, Java can be fast but can consume much more memory.. and even then, performance can be fine until garbage collection runs, and if I wanted to use it for a game, I can't afford the whole app freezing for tens of ms. JVM tuning is not an easy task; it could probably be adjusted for games to be fast enough, but overall performance would probably suffer.

Im old goal oriented goatman, i care about facts and freedom, not about egos+prejudices. Hoarding=sickness. If you want respect, gain it by your behavior. I hate stupid SW limits, SW=virtual world, everything should be possible if you have enough raw HW.

Reply 31 of 54, by Falcosoft

User metadata
Rank: Oldbie
ruthan wrote:

Freepascal:
I just tried it and compiled a 'hello world' with it for fun, and my antivirus Avira thought it was some virus. I guess if it were more widely used on the market, they would exclude it in their rules..

Nowadays AV engines are overly paranoid. They think 100 false positives are still better than 1 false negative and make the life of small developers very hard:
http://blog.nirsoft.net/2009/05/17/antivirus- … all-developers/
https://coolsoft.altervista.org/en/blog/2018/ … mall-developers
http://falcosoft.hu/shoutbox/index.php?message=292

Also, the worst kind of heuristic I have ever seen, used by many AV vendors (even by Windows SmartScreen),
is 'reputation based'. It means a Catch-22 for small developers: the program is flagged as malicious just because it's not used by many others. That's it. But how can a program be used by many others when it's flagged/deleted right after it's released? (Maybe this 'very intelligent' algorithm also deleted your compiled file.)

Website, Facebook, Youtube
Falcosoft Soundfont Midi Player + Munt VSTi + BassMidi VSTi
VST Midi Driver Midi Mapper

Reply 32 of 54, by ruthan

User metadata
Rank: Oldbie
Falcosoft wrote:

Nowadays AV engines are overly paranoid. They think 100 false positives are still better than 1 false negative and make the life of small developers very hard..

Yeah, you are right, it's stupid, but it is also a mark of how widespread a tool is, because if lots of people complain they usually fix the false detection.


Reply 33 of 54, by MERCURY127

User metadata
Rank: Member
ruthan wrote:

We have the first result which shows that the card manufacturer actually matters: my Micro-Star (MSI) Geforce 7600 GT reports a vbios size of 59 KB, Falcosoft's Inno3D only 55 KB. My card is PCI-E with 2x DVI + S-Video; maybe that matters too.

It is NOT the real size.
GF 6xxx+ series vbios is too big; it even has packed fonts in the BIOS, and also has different methods for unpacking - in-place and via CRT registers. Without these tricks the size would be 80-90 KiB.
The real size of these BIOSes, as occupied in flash, is approx 60-62 KiB in packed state. During POST, the code loaded into system memory does its tasks, then unpacks the fonts and moves/overwrites itself (partially). After that, the code does other init tasks, cuts some of its own temporary code/data and, at the end, modifies its own size byte to the final size. After this, the main BIOS code reads this byte and sets the shadow registers to block writes in this area.

Reply 34 of 54, by ruthan

User metadata
Rank: Oldbie

All results are now CheckIt-free.

MERCURY127: Wait for Falcosoft's response.. I'm not an expert here. I know that some special memory area, I think B000-BFFF, is dedicated to text modes by design, and in theory the unpacked stuff could be stored outside of conventional/upper memory, where there is plenty of space - what we care about is free conventional memory.


Reply 35 of 54, by Falcosoft

User metadata
Rank: Oldbie
MERCURY127 wrote:

it is NOT real size.

I do not think MERCURY127 has followed all the posts in this topic (which I understand, since there was so much off-topic talk 😀), because we already know this:

Falcosoft wrote:

If you dump video bios (firmware is a better term) of modern video cards you can see that the size is usually 128 KB+. Geforce GTX 960 full video bios dump file size is 176 KB. There is no such space in C0000 - xxxxx range in real mode to store the copy of full video firmware anymore. Not to mention that a modern card's firmware contains routines that are completely irrelevant under DOS such as dynamic performance modes/power saving features, all 3D acceleration related stuff. And modern 32-bit/64-bit operating systems never use the video bios available in real mode at C0000-xxxxx range but use own drivers.

But the real question is what size is relevant under DOS if we want more UMB/Conventional memory etc. And the answer is the relevant size is NOT the real size of the whole firmware/BIOS.
Case study Geforce GTX 960: The whole firmware size of this card is rather huge: 176 KB. But under DOS VBSize/NSSI reports only 58KB. Let's see which is the relevant size.
If you look at the whole firmware dump file you can see that the part that makes the payload that is available at 0xC0000 starts at 0x800 in the firmware dump file and makes only about 1/3 of the whole firmware.
I have made a run-time memory dump from a working PC using this card from physical address 0xC0000 - 0xDFFFF (128 Kb). As can be seen you can really use the memory area at 60KB offset from the start of 0xC0000 (0xCF000 - 0xCFFFF, 4KB) for UMB blocks since the size reported by VBsize/NSSI (58KB) is the relevant size, so addresses above the reported size are free to use from the first free 4K aligned address to next option ROM (in case of AMD integrated memory controller using UMBPCI). Then at address 0xD0000-0xD4FFF you can find the AHCI option ROM. And from 0xD5000 - 0xDFFFF there is another UMB area.
So to sum it up: under DOS it is not the real size of the full firmware/BIOS that is relevant, only the smaller payload part that is reported by VBSize/NSSI.

Filename
GM206_original_full_rom.zip
File size
136.3 KiB
Downloads
85 downloads
File license
Fair use/fair dealing exception
Filename
PCDUMP_C0000-DFFFF.zip
File size
60.53 KiB
Downloads
83 downloads
File license
Fair use/fair dealing exception


Reply 36 of 54, by MERCURY127

User metadata
Rank: Member

Falcosoft, yes, sorry, I didn't read all the posts. 😀
But... in your upload I see only ONE normal x86 vbios, at 800-fbff,
and it doesn't look packed... also there are NO DOS fonts.
The second block at fc00-223ff looks packed... or like non-x86 device code.
Strange. It looks more to me like dual legacy and UEFI/GOP blocks.

Reply 37 of 54, by MERCURY127

User metadata
Rank: Member

I think we have no chance of slipstreaming newer BIOSes. Even for the oldest video cards from the '90s it's impossible. These devices are too complex to modify.

Reply 39 of 54, by MERCURY127

User metadata
Rank: Member
Falcosoft wrote:

Case study Geforce GTX 960: The whole firmware size of this card is rather huge: 176 KB.

I think you got a broken image. The full image must be 128 or 256 KiB... 😕