
Is this an alternative to C?

Started by elmer, 06/21/2016, 12:07 PM



elmer

The big problem that I'm seeing with both HuC and CC65 is that they're both really, really dumb.

They seem to often generate lots of code that they don't need to.

Now, maybe SDCC will be smarter ... but that's still to be determined.

The question in my head keeps on coming back to "if the code that's produced is going to be so bloated and bad that I've got to rewrite it in assembly language, then why not just write it in assembly language from the start?".

Well, at least for me, one of the annoyances of normal assembly is just how much the minutiae of the language actually obscures what I'm trying to do.

As Arkhan pointed out ... it's hard to experiment with stuff when you're dealing with all that "clutter".

A good Macro Assembler with a large and well-designed set of macros can really help out there, but it's still not as friendly as I'd like.

I've just found NESHLA ... that takes an interesting approach to wrapping the annoyances of normal assembly language into something that superficially looks a lot more like C, but with none of the overhead of that stack-based language.

I've done a search, but nobody seems to have discussed this language here. What do folks think?

Does anyone else but me like the idea of something like this as a front-end that would generate code for PCEAS or CA65?

http://neshla.sourceforge.net/

Arkhan Asylum

That could be neat, but only if it really does an optimal job, otherwise, you can just do the C-->ASM by Hand thing.

As it turns out, much of the game logic for a game is pretty stupid easy, once you know what it SHOULD do.

So, the conversion isn't very painful.
This "max-level forum psycho" (:lol:) destroyed TWO PC Engine groups in rage: one by Aaron Lambert on Facebook "Because Chris 'Shadowland' Runyon!," then the other by Aaron Nanto "Because Le NightWolve!" Him and PCE Aarons don't have a good track record together... Both times he blamed the Aarons in a "Look-what-you-made-us-do?!" manner, never himself nor his deranged, destructive, toxic turbo troll gang!

elmer

#2
Quote from: guest on 06/22/2016, 12:57 AMThat could be neat, but only if it really does an optimal job, otherwise, you can just do the C-->ASM by Hand thing.
I've taken a look at the source, and NESHLA makes the (IMHO) dumb move of actually being a full assembler that produces its own binary NES ROM files. That makes it unsuitable right now for the PCE.

But the concept of what he's doing there doesn't really need that.

The idea of that C-like structured assembler could just as easily be done as a pre-processor that spits out PCEAS or CA65 source files.

A pre-processor like that is easy to write these days.

All that it really is is just a way of writing assembly language code that looks more C-like, makes the code more readable by not obscuring the structure, and stops the programmer from having to manually deal with all those local labels and junk that makes assembly language more tedious than it needs to be.

What he's actually doing in NESHLA maps exactly to assembly language ... there is no overhead at all.

That's what I'm liking about it (as an idea). It's minimal.
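To make that concrete, here's a rough sketch of the idea. The structured syntax below is illustrative rather than NESHLA's exact grammar, and "enemy_count"/"spawn_wave" are made-up names:

    ; structured source: C-like braces, but still 1:1 with real instructions
        lda     enemy_count
        if (!zero) {
            dec enemy_count
        } else {
            jsr spawn_wave
        }

    ; what a pre-processor could emit for PCEAS/CA65 (labels auto-generated)
        lda     enemy_count
        beq     .else_1
        dec     enemy_count
        bra     .endif_1
    .else_1:
        jsr     spawn_wave
    .endif_1: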

I'm feeling that by the time that you've massaged your C code to use all static variables and to avoid generating all the "expensive" code, then you're basically back to simple instructions anyway, so why live with C's overhead?

BTW ... the cost/benefits of the NESHLA structured-assembler concept may change if SDCC eventually produces code that doesn't suck. At least it's going to be using a small-and-fast stack for any stack-based local variables/calculations.

TurboXray

Quoteso why live with C's overhead?
This is the problem I have with HuC. I haven't messed with CC65, so I'm not sure about that, but it's not so much the output code as it is the structure overlaid on top of it. The code can be optimized with replacement assembly stuff, but the structure is a pain in the ass (some stuff is impossible to do with HuC; no amount of assembly will fix it).

I've seen NESHLA, and to be honest it always pops into my head when these C discussions come up. I like the idea very much, but there's one catch - you need to have a strong grasp of assembly for the processor, as well as the surrounding hardware. So something like NESHLA is for assembly programmers, not beginners or strictly C enthusiasts.

For me personally, the only real nuisance of assembly is wading through the listing. Everything's really long, so it's more difficult to quickly parse through stuff. Having multiple files helps cut it down, but I always forget which file contains what... so looking through them is annoying too. Writing code, designing logic and structure - none of that is a problem for me in assembly.

 I would definitely use PCEHLA.

elmer

#4
Quote from: TurboXray on 06/22/2016, 12:07 PM
Quoteso why live with C's overhead?
This is the problem I have with HuC. I haven't messed with CC65, so I'm not sure about that, but it's not so much the output code as it is the structure overlaid on top of it. The code can be optimized with replacement assembly stuff, but the structure is a pain in the ass (some stuff is impossible to do with HuC; no amount of assembly will fix it).
By "structure", do you mean all that messing about with putting expressions/temporary results on the C stack?

CC65 isn't really much different to HuC, it's just been optimized a lot more ... but then goes and throws away a bunch of that optimization by calling subroutines for some stuff instead of expanding the code in macros.

Either way ... both of them use "[sp],y" and "inc/dec sp" all the time, which kills whatever is left of your performance.

They both do all the stack-based temporary math as 16-bit, but CC65 implements a lot of peephole optimizations that can sometimes make it somewhat less expensive than HuC.

But, it's just (IMHO) dumb to keep doing things as 16-bit when you don't actually need to.
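As a hedged illustration of the difference (the exact instructions vary by compiler; "__sp" is just a stand-in for the zero-page software stack pointer, and "n" for an 8-bit static variable):

    ; roughly the shape of stack-relative code for "n = n + 1" on a 16-bit local
        ldy     #0
        lda     [__sp],y     ; fetch the low byte through the software stack
        clc
        adc     #1
        sta     [__sp],y
        iny
        lda     [__sp],y     ; then touch the high byte too, even when
        adc     #0           ; the value would fit in 8 bits
        sta     [__sp],y

    ; the same operation on the 8-bit static variable
        inc     n            ; one instruction, no stack traffic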

*************

SDCC's code generator is completely different, which is why it stands a chance of delivering better code.

It internally figures out all the local variables, calculations and math in terms of temporary results (8-bit/16-bit/32-bit signed/unsigned), and then does lots of trial passes through the code-generator to decide which of these temporary results should actually be in registers or real locations in memory (either as static locations, or on the stack).

Since a small-stack access can be "abs,x", that's just as fast as a static "abs" reference on the PCE.

That's not too conceptually different to how an assembly-language programmer thinks.
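A quick sketch of that point, with hypothetical names:

    ; a small-stack local accessed as "abs,x" ...
        lda     locals+2,x
    ; ... runs just as fast on the PCE as a plain static "abs" reference
        lda     counter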

I just don't know, yet, how well it's all going to work out in practice because I'm entirely dependent upon the SDCC "guru" to get the code-generator written.

The SDCC codebase itself is barely documented, and seems to rely on a lot of insider knowledge of state information that I just don't understand.  ](*,)


QuoteSo something like NESHLA is for assembly programmers, not beginners or strictly C enthusiasts.
Yes, definitely.

There's not much that can be done for people that will only use C, but back in my day, assembly language was for beginners, or at least almost-beginners.  :-k


QuoteFor me personally, the only real nuisance of assembly is wading through the listing. Everything's really long, so it's more difficult to quickly parse through stuff.
Yes, that's where NESHLA's approach really appeals to me (right now).

Assembly language isn't really that hard, but traditional assembly code can be a bit of a PITA to follow because of the verbosity, and because the actual structure is hidden inside lines of code that all look identical to the eye.

It's like running C code through an obfuscator that removes all the spacing and line breaks and just puts everything on the same line.

Perfectly legal C code ... just a huge PITA to read and understand.


QuoteWriting code, designing logic and structure - none of that is a problem for me in assembly.
Yep, assembly coding itself usually isn't that difficult.  :wink:

Gredler

Quote from: elmer on 06/22/2016, 01:05 PMYep, assembly coding itself usually isn't that difficult.  :wink:
#-o ](*,) [-(

elmer

Quote from: Gredler on 06/22/2016, 06:11 PM#-o ](*,) [-(
Hahaha ... no, really, I mean it!  :lol:

Don't forget, teenagers at school were doing it 30 years ago.  :-"

dshadoff

Quote from: elmer on 06/22/2016, 07:18 PM
Quote from: Gredler on 06/22/2016, 06:11 PM#-o ](*,) [-(
Hahaha ... no, really, I mean it!  :lol:

Don't forget, teenagers at school were doing it 30 years ago.  :-"
It's true.  I did.  But it's not like there were hundreds of languages available for home computers until 5-10 years later.  Some of the languages which were available were of questionable use (look up 'Logo' and 'Forth').

Z80 assembler was the second computer language I learned (BASIC was first).
6809 assembler was the next one.
And I was about to learn 6502 as my next language (but the 6502 was dying by that time).

...But this was one of the problems of assembler - one had to conform to the system one was writing on.
After the plentiful registers on the Z80, it took a lot of effort to wrap my head around the minimalist 6809.

...we were patient back then.  I recall saving my programs on 500-baud cassette tapes for 26 minutes per copy.  (Of course, I made backups too).


Oh, and can you guess why anybody would learn assembler?
Because it was about 100 times faster than BASIC.

elmer

Quote from: dshadoff on 06/22/2016, 08:57 PMSome of the languages which were available were of questionable use (look up 'Logo' and 'Forth').
Amazingly, the vestiges of FORTH still exist. What a beautifully horrible language.

A masterpiece for its time, and an absolute nightmare.

I considered making AtLast (AutoDesk's interesting 32-bit derivative of FORTH) the in-game scripting language for our company's game engine.  :oops:

After it became clear that every other programmer would quit, I relented and we implemented something usable instead.  #-o


QuoteZ80 assembler was the second computer language I learned (BASIC was first).
Me, too. What machine did you get to start on? I think that you're an American, so perhaps an Altair or TRS-80 or something like that?  :-k

My school got a RML-380Z with a 300 baud cassette, and I was one of the early users of that.

https://en.wikipedia.org/wiki/Research_Machines_380Z

Same path ... BASIC, then assembly ... mostly so that I could do incredibly "mature" things like hacking the BASIC interpreter, which loaded from cassette, to say "TAMPAX ERROR" instead of "SYNTAX ERROR".

Some of us were nerds, even back before the term was coined.

After the school finally bought a dual 8-inch floppy drive, there was no stopping me!


Quote6809 assembler was the next one.
And I was about to learn 6502 as my next language (but the 6502 was dying by that time).
I'm jealous ... I've always wanted to have a chance/reason to play with a 6809!

I went straight from the Z80 to the 6502 (on the Apple II).


Quote...But this was one of the problems of assembler - one had to conform to the system one was writing on.
After the plentiful registers on the Z80, it took a lot of effort to wrap my head around the minimalist 6809.
And this is really the issue. Early microprocessor architectures were very different, and they weren't designed with high-level languages in mind.

This is why C is such a lousy fit on the PCE.

It wasn't until the late 1970s, when the M68000, Z8000 and NS32000 came out (all based upon ideas from expensive minicomputers), that microprocessor CPUs were really designed to run high-level languages.

You can just about throw Intel's 8086/8088 in there, too ... but it was mainly designed for backwards-compatibility (at a source-code level), rather than forward thinking.

Unfortunately, IBM/Intel won the marketing battle, and we've all spent decades locked into that horrible architecture until AMD finally came up with the x64 improvements, and Apple made the ARM chip sexy.

P.S. No, I'm not forgetting MIPS, SPARC or PowerPC ... they just had a very limited niche.

spenoza

Quote from: elmer on 06/22/2016, 09:50 PMP.S. No, I'm not forgetting MIPS, SPARC or PowerPC ... they just had a very limited niche.
PowerPC was more widely used than many suspect, though in some weird places.

elmer

Quote from: guest on 06/22/2016, 10:03 PMPowerPC was more widely used than many suspect, though in some weird places.
Yep, good point. It would count as more successful than the Z8000 and NS32000 combined ... and that's just in the GameCube!  :wink:

Being a game guy, I can think of Apple PowerPCs, GameCube, Wii, WiiU, Xbox 360 and PS3 ... and I'm sure that it must have had a different and successful life in the embedded world, too.

Didn't it also get used in some of IBM's servers and supercomputers somewhere?

Can you think of other examples to help me out?

TurboXray

#11
I did BASIC as a kid, and learned C with OOP (that's what it was called in 1993/4) for about 6 months as a late teen. In 1999, I learned x86 assembly and z80 assembly at the same time. That was about 6 months, then directly switched over to GBz80 assembly (Gameboy) for another 6 months. And then stopped.

 It wasn't until 2005 that I started huc6280 assembly. About a year later I started learning C again (this time without OOP nonsense) so I could write support utils on the PC. The rest is history (with other processors thrown in). So it wasn't until 2005, when I was 29, that I really delved into assembly. I loved it.

65x was weird to get used to at first, but I never did like x86. It felt so convoluted in approach (I did page mode addressing) - even without having any other experience with other processors. Looking back now, after coding more modern processor designs, I was right. Going back to z80 after all these years, I don't like that it doesn't have direct memory load/store into the A register (everything is pretty much done through the A reg like on the 65x and 68x and other related processors, so this bottleneck sucks). Too much register juggling - and stack use. The nice but limited 16bit macros (hardware macros) don't make up for it.

Off topic: I read that AMD was going to release a new PC processor with an ARM core. I thought this was a fantastic idea; much better ISA - and get away from the x86 design. But I haven't seen anything about it in more than a year. I thought that would be the next big thing like x64, but I guess it silently fell through the cracks.

 Edit:
The PIC 16F84 and related series MCUs are the worst processors that I've ever coded assembly on. That processor is insanely crippled. About the only thing worse would be trying to program a DSP chip as a regular processor (it's almost the same level of convoluted-ness).

spenoza

Quote from: elmer on 06/22/2016, 10:20 PM
Quote from: guest on 06/22/2016, 10:03 PMPowerPC was more widely used than many suspect, though in some weird places.
Yep, good point. It would count as more successful than the Z8000 and NS32000 combined ... and that's just in the GameCube!  :wink:

Being a game guy, I can think of Apple PowerPCs, GameCube, Wii, WiiU, Xbox 360 and PS3 ... and I'm sure that it must have had a different and successful life in the embedded world, too.

Didn't it also get used in some of IBM's servers and supercomputers somewhere?

Can you think of other examples to help me out?
IBM used it for a while in the Power series servers, and the 603 and variants were used as embedded controllers for lots of stuff. Various 603 versions powered Sega's impressive Model 3 arcade boards (precursor to Naomi/Dreamcast) and Konami used a bunch of PowerPC variants in arcade machines as well. You can bet Motorola was doing their best to sell that chip to anyone who would buy, because Apple wasn't necessarily perceived as a safe business partner at the time.

spenoza

Quote from: TurboXray on 06/22/2016, 10:21 PMOff topic: I read that AMD was going to release a new PC processor with an ARM core. I thought this was a fantastic idea; much better ISA - and get away from the x86 design. But I haven't seen anything about it in more than a year. I thought that would be the next big thing like x64, but I guess it silently felt through the cracks.
They are developing it for low-power, low-cost blade servers. It's called the Opteron A. No idea if it is a successful platform.

http://www.amd.com/en-us/products/server/opteron-a-series

OldMan

QuoteI did BASIC as a kid, and learned C
in order, as I remember it....
Pascal. Fortran 4, then 77. Cobol. PL/1. 360/370 assembler. Logo. Basic. C. 6809 assembler. 386 assembler.
Then, in no particular order...
C++ / Java / C# / MIPS assembler / and finally, 6502 assembler.

Most of the early stuff was in college. 2 years of Fortran, right around the changeover from 4 to 77.
I also learned a smattering (enough to do the assignments) of Lisp and Prolog, but not really enough to do anything serious with.

Logo/Basic/C/6809 were from my CoCo days, near graduation. That's when I fell in love with C - and learned to speed it up with assembler :) Somewhere around here I still have a kABOOM! clone for the CoCo. Analog sticks were awesome.

The rest were things I picked up helping my nephew and others do their school work. C++ was nice, if inefficient. Java was worse. 386 assembler was...meh. MIPS assembler was courtesy of the PS1/2 and an introduction to Linux. (Remember when it would boot from a single 1.4M floppy?)
And now, the pce and 6502 based assembler.

And my best programming memory of those times... Setting up a network (wow, 10Mbps) in the house, and being challenged to write Battleship. Over the network. My nephew's friend used Visual C, I used Borland C. He turned it in as a school project ("Write a simple game"), showing how to do networking in both languages... and almost got expelled because everyone in school was playing and dragging down the network.

Good times.

OldMan

Quotefriend used Visual C
My bad. Visual Basic, not Visual C. He did a lot of weird things in Basic.

TurboXray

Quote from: TheOldMan on 06/22/2016, 11:23 PMThe rest were things I picked up helping my nephew and others do their school work. C++ was nice, if inefficient. Java was worse.
My first two courses for CS are Java-based. It's my first real exposure to Objects and classes (I stayed away from that stuff in C++). It's not bad, so far. I have no idea what the performance is on the JVM, but I'm thinking of doing some GUI-based utils with it (cross-platform portability interests me). I lucked out because they're ending Java classes at the college and replacing them with Python. So I avoided Python just in time (no pun intended). We're also doing a lot of JUnit testing, which I've found to be a really nice tool for testing classes and methods. After which, we have to submit our code to an online server that does additional range checks in real time.

dshadoff

Quote from: elmer on 06/22/2016, 09:50 PMAmazingly, the vestiges of FORTH still exist. What a beautifully horrible language.

A masterpiece for its time, and an absolute nightmare.

I considered making AtLast (AutoDesk's interesting 32-bit derivative of FORTH) the in-game scripting language for our company's game engine.  :oops:

After it became clear that every other programmer would quit, I relented and we implemented something usable instead.  #-o
Yeah, Sun Microsystems was using FORTH in their OpenBoot PROMs in the late 90's, and I was sent to Palo Alto to debug a broken SCSI tape driver in OpenBoot.  Without knowing FORTH.  Or SCSI.  But all was well, as it was easy enough to fix (just obscure, and tedious to test).

Quote
QuoteZ80 assembler was the second computer language I learned (BASIC was first).
Me, too. What machine did you get to start on? I think that you're an American, so perhaps an Altair or TRS-80 or something like that?  :-k
Well, Canadian - but you're close.
Yeah, TRS-80 Model I.  I bought it with the money I made on a paper route, I guess it was 1979 or 1980.

Quote
Quote6809 assembler was the next one.
And I was about to learn 6502 as my next language (but the 6502 was dying by that time).
I'm jealous ... I've always wanted to have a chance/reason to play with a 6809!

I went straight from the Z80 to the 6502 (on the Apple II).
I almost went the Apple II route; I went TRS-80 Color Computer instead.

QuoteBeing a game guy, I can think of Apple PowerPCs, GameCube, Wii, WiiU, Xbox 360 and PS3 ... and I'm sure that it must have had a different and successful life in the embedded world, too.

Didn't it also get used in some of IBM's servers and supercomputers somewhere?

Can you think of other examples to help me out?
A more modern descendant is still in IBM's System p servers running AIX.  They recently announced POWER9.

Other than that, I've seen it used as an embedded processor in things like printers, but that use is quickly being replaced with ARM variants and low-power x86.

elmer

Quote from: TurboXray on 06/22/2016, 10:21 PMI did BASIC as a kid, and learned C with OOP (that's what it was called in 1993/4) for about 6 months as a late teen. In 1999, I learned x86 assembly and z80 assembly at the same time. That was about 6 months, then directly switched over to GBz80 assembly (Gameboy) for another 6 months. And then stopped.
Interesting ... I thought that you were much younger, but we're really not that far apart (in cosmic terms). You just followed a different path to some of us ancient proto-greeks.  :wink:


Quote65x was weird to get used to at first, but I never did like x86. It felt so convoluted in approach (I did page mode addressing) - even without having any other experience with other processors. Looking back now, after coding more modern processor designs, I was right. Going back to z80 after all these years, I don't like that it doesn't have direct memory load/store into the A register (everything is pretty much done through the A reg like on the 65x and 68x and other related processors, so this bottleneck sucks). Too much register juggling - and stack use. The nice but limited 16bit macros (hardware macros) don't make up for it.
IMHO, it's just like the 6502 ... fatally flawed, but in its own unique way.

BITD we used to draw sprites by setting the SP to the target location and then "LD HL,#$nnnn; PUSH HL", pretty much like your PCE "ST0" commands.

It's just another early architecture that rewards nasty "trick" programming ... just like the GameBoy's not-really-Z80 variant.


QuoteThe PIC 16F84 and related series MCUs are the worst processors that I've ever coded assembly on. That processor is insanely crippled. About the only thing worse would be trying to program a DSP chip as a regular processor (it's almost the same level of convoluted-ness).
Have you looked at the insane restrictions of the 8051 architecture?

If SDCC can do a "bearable" job on that CPU, then I hope that we can end up with something "usable" on the 6502.
But it's not really up to me ... I'm just the "support" guy hoping that the "guru" doesn't bail out!  [-o<


Quote from: guest on 06/22/2016, 10:28 PMIBM used it for a while in the Power series servers, and the 603 and variants were used as embedded controllers for lots of stuff. Various 603 versions powered Sega's impressive Model 3 arcade boards (precursor to Naomi/Dreamcast) and Konami used a bunch of PowerPC variants in arcade machines as well. You can bet Motorola was doing their best to sell that chip to anyone who would buy, because Apple wasn't necessarily perceived as a safe business partner at the time.
Ahhhh ... sounds like you may have had a career in "big-iron" or in sales. I'd love to pick your brains over a beer sometime!  :wink:


Quote from: TheOldMan on 06/22/2016, 11:23 PMin order, as I remember it....
Pascal. Fortran 4, then 77. Cobol. PL/1. 360/370 assembler. Logo. Basic. C. 6809 assembler. 386 assembler.
Then, in no particular order...
C++ / Java / C# / MIPS assembler / and finally, 6502 assembler.

Most of the early stuff was in college. 2 years of Fortran, right around the changeover from 4 to 77.
Sounds like you're a few years older than me. I missed all of the historical mainframe languages, and started HLLs with Pascal and C.

It's been interesting to see the progression of Niklaus Wirth's ideas in Pascal vs Kernighan & Ritchie's ideas in C.

While Pascal/Oberon/etc are pretty much dead to the world ... Wirth's ideas live on somewhat in Anders Hejlsberg's C#.

It's an interesting evolution ... and so much more of a "practical" solution than Bjarne Stroustrup's abominable C++ mistakes.


QuoteThe rest were things I picked up helping my nephew and others do their school work. C++ was nice, if inefficient. Java was worse.
C++ is an interesting attempt to add object-orientation and more modern design techniques to C ... but I hate it.

Every time I look at it ... if there's a design choice to be made, Bjarne usually made the wrong one (IMHO).

Then again ... he's rich and famous, and I'm pontificating on a PCE forum, so where do I get the right to criticize?  [-X

Bonknuts, and anyone else interested in the actual "costs" of rampant overuse of object orientation, this was an interesting talk by Sony ...

http://harmful.cat-v.org/software/OO_programming/_pdf/Pitfalls_of_Object_Oriented_Programming_GCAP_09.pdf

OldMan

QuoteSounds like you're a few years older than me.
54. And I don't care who knows it. At 50, I earned the right to be a grumpy a$$hole :)

Quote...Kernighan & Ritchie's ideas in C.
One of my professors knew Kernighan. He worked at DEC on the VAX stuff (the professor; dunno 'bout Brian). Said they used to do pre-processors like C++ all the time. He wasn't impressed by it. Or by Ratfor, either.

QuoteC++ is an interesting attempt to add object-orientation and more modern design techniques to C ... but I hate it.
I don't -hate- it; it has some good points (Mostly keeping stuff out of the way.)
But I do like to know when garbage is being picked up, and not having to chase through 30 layers of code to find out "Oh. That's a bloody one-line function, overloaded 60+ different ways...Now which one is actually being used?"
I honestly like the C pointer-to-function stuff better.  Those I can use in arrays. :)

Arkhan Asylum

I started with QBASIC, C+some 68k on the Amiga, and BASIC/6502 on the C64.  But the C64 never really went well because the fucking disk drive would goon out all the time. 

assembly in general never went well because a lot of it was me winging it as a child, with whatever books we had laying around. Most of my fun was in QBASIC because I'd fuck with all of the games and make them unplayable so I could watch my sister get demolished by the CPU. I made the Pac Man game go way too fast, and the ghosts would just zero in on you immediately and kill you.

lol, and I made Nibbles controls reversed so she couldn't play it.  It was pretty hilarious.


by the time I was really coherent, though, C++ was a thing, people were running away and crying at assembly, and everyone was doing the gradual shift towards the current programming paradigm of being a lazy, pretentious dickhead chasing whatever the new hotness is.

First, it was Java, now it's C# + any web platform.

it's all pretty dumb.


This "max-level forum psycho" (:lol:) destroyed TWO PC Engine groups in rage: one by Aaron Lambert on Facebook "Because Chris 'Shadowland' Runyon!," then the other by Aaron Nanto "Because Le NightWolve!" Him and PCE Aarons don't have a good track record together... Both times he blamed the Aarons in a "Look-what-you-made-us-do?!" manner, never himself nor his deranged, destructive, toxic turbo troll gang!

elmer

Quote from: TurboXray on 06/23/2016, 12:06 AMMy first two courses for CS are Java-based. It's my first real exposure to Objects and classes (I stayed away from that stuff in C++). It's not bad, so far. I have no idea what the performance is on the JVM, but I'm thinking of doing some GUI-based utils with it (cross-platform portability interests me). I lucked out because they're ending Java classes at the college and replacing them with Python.
There's nothing wrong with objects and classes, they're a great way to express some ideas and build some programs.

The concept has definitely proven itself valuable.

The problem comes with their overuse, typically by new adherents to the paradigm who naturally try to turn everything into an object.

I really can't understand why Python seems to be taking over so much mind share in academia ... it's just like teaching BASIC.

And sorry, but any language where exact spacing is significant ... that's f**ked-up!  :roll:


Quote from: dshadoff on 06/23/2016, 12:36 AM
Quote from: elmer on 06/22/2016, 09:50 PMI'm jealous ... I've always wanted to have a chance/reason to play with a 6809!

I went straight from the Z80 to the 6502 (on the Apple II).
I almost went the Apple II route; I went TRS-80 Color Computer instead.
I only used the Apple II for a few months for a project in my first job, and then got to do another project on the BBC Micro; I didn't actually own either of those machines.

When I had the cash, I bought an Atari 800.  :)


Quote from: TheOldMan on 06/23/2016, 02:30 AM54. And I don't care who knows it. At 50, I earned the right to be a grumpy a$$hole :)
Heck, I'm 53, and I'm definitely a grumpy old curmudgeon!  :wink:

I wonder how you got exposed to all those mainframe languages that I missed out on.

My university CS Dept was running a PDP 11 and Unix, and we time-shared on another university's larger mainframe.

IIRC, we were started with Pascal on the mainframe, and then later courses shifted to C on the PDP 11.

The HLLs all seemed very painfully slow because I'd been programming in assembly on a microprocessor for a few years before university.


Quote
QuoteC++ is an interesting attempt to add object-orientation and more modern design techniques to C ... but I hate it.
I don't -hate- it; it has some good points (Mostly keeping stuff out of the way.)
But I do like to know when garbage is being picked up, and not having to chase through 30 layers of code to find out "Oh. That's a bloody one-line function, overloaded 60+ different ways...Now which one is actually being used?"
I honestly like the C pointer-to-function stuff better.  Those I can use in arrays. :)
OK, I guess "hate" is too strong a word. I just think that it's overrated and ugly and that it's taken the wrong path. I still happen to use it as my main language anyway.

If Bjarne had been a decent designer, then there wouldn't be stupid "fad" ideas in there like operator-overloading, or the curse of multiple-inheritance screwing things up, and he'd have had the courage to make all objects require a vtable and descend from a single "base" object, and then he could have implemented proper introspection into the darned language.

The whole "reference" instead of "pointer" seems to have bitten us in the ass more than it's helped, too.

It was interesting to see C11's "anonymous" structs and unions finally let us all implement some OO design features in a nice way in C.


Quote from: guest on 06/23/2016, 04:51 AMI started with QBASIC, C+some 68k on the Amiga, and BASIC/6502 on the C64.  But the C64 never really went well because the fucking disk drive would goon out all the time.
Hahaha ... those disk drives were horrible, weren't they!  :lol:

OldMan

QuoteI wonder how you got exposed to all those mainframe languages that I missed out on.
My university CS Dept was running a PDP 11 and Unix, and we time-shared on another university's larger mainframe.
IIRC, we were started with Pascal on the mainframe, and then later courses shifted to C on the PDP 11.
Pascal was first, as an 'introduction' to programming. I didn't like it, and ended up taking a year off.
When I went back (different college, btw) I had to take fortran 4. Once that was done, the CS courses opened up. The college ran 2 tracks: Math based (ie, CS) and business based. The next couple of times I had to choose courses, I split between the 2. 1 CS course, 1 business course. Most of the odd stuff was from the business courses.
The college ran a DEC VAX-11/780, and did time share for some local places; they also ran an IBM 360/370 that did 'bookkeeping' stuff (hence the business degree track). Funny thing is, until my 3rd year, they didn't have C installed there. I learned it on the CoCo.

The profs were annoyed when they started offering classes in C, because by then I knew a lot of the other geeks, and they would always ask me to help debug (rather than the profs). The profs were surprised when I pointed out a few errors that they couldn't find - one of which was a bad pointer problem which would write to the stack by accident and crash the whole system. Gotta love off-by-1 errors. :)

FWIW, I ended with a CS degree and a history minor. That's what happens when you take all the elective stuff as regular courses, and fill the last semester with things like 'art appreciation' :)

TailChao

#23
In response to the original topic -

Is the goal just to make development faster and simpler, but still have everything run on the PC-Engine?

Why not just run everything on a microcontroller and use the console for video, audio, and input? That way you can just write everything in C (or C++) on a PC with some little emulated bits, then shove it on a sub $5 ARM. I can almost guarantee this is "cheaper" since the time investment will be lower than adding HuC6280 support to another compiler, and you get a PC version of the game for similarly low effort.

The fact that the performance will likely be better is another plus.
Nintendo did it back in the day, and there's no shame in doing it now to ship a better product.

elmer

Quote from: TailChao on 06/23/2016, 02:45 PMIs the goal just to make development faster and simpler, but still have everything run on the PC-Engine?

Why not just run everything on a microcontroller and use the console for video, audio, and input? That way you can just write everything in C (or C++) on a PC with some little emulated bits, then shove it on a sub $5 ARM. I can almost guarantee this is "cheaper" since the time investment will be lower than adding HuC6280 support to another compiler, and you get a PC version of the game for similarly low effort.
Hahaha ... you're right, that's certainly one option.  :lol:

It may even be a sensible option.

But it's not the intellectually-challenging option. It's not "fun" (to me, at least).


QuoteThe fact that the performance will likely be better is another plus.
Nintendo did it back in the day, and there's no shame in doing it now to ship a better product.
So did Sega with the Mega-CD and the 32x.  :wink:

***************

If you just want to write a "game", and don't really care what it runs on, then there are plenty of options out there ... like Unity.

Oh, and I, personally, have no interest in doing a PC game. Been there, done that, have the polo shirt.

It's interesting (to me, at least) to try to push the old hardware and see what can be done.

I've quickly lost interest in modifying either HuC or CC65. They're too broken at a low level to ever get good results.

We'll see whether the SDCC guru manages to keep up his interest in adding 6502 support to that compiler, or whether that will fade as the difficulties mount.

He's already added support for a different architecture and maintains that port, so he knows exactly what to do, and he has a much better idea of the potential difficulties than I do.

Whatever happens ... my time investment isn't likely to need to be that great, because there's no way that I can jump into that codebase and help; it's way too opaque.

As an alternative, morphing the NESHLA concept into a PCEHLA front-end for CA65 might be interesting to do.

There are so many compiler-compiler tools out there these days that just reading in a program in C-style-assembler and then transforming it into standard assembler is a pretty simple project.

Then I could get to play with the CoCo/R parser-generator again. It's so much nicer than lex/yacc, and it already has a complete ANSI C grammar written for it. Just add the symbol table, shake it all up, put it into the oven for half-an-hour, and it's done. How tough could it be? (Famous last words!  :wink:)

But, it's a low priority at the moment.

TailChao

Quote from: elmer on 06/23/2016, 03:33 PMHahaha ... you're right, that's certainly one option.  :lol:

It may even be a sensible option.

But it's not the intellectually-challenging option. It's not "fun" (to me, at least).
It may not seem intellectually challenging at first, but once you start thinking about dedicating all the HuC6280's time to banging raster effects while the ARM does your game logic, it gets more interesting.

I totally understand your point though, and wouldn't actually go with this option for this particular console.


Quote from: elmer on 06/23/2016, 03:33 PMSo did Sega with the Mega-CD and the 32x.  :wink:
They certainly did, and were probably inspired by a fairly large spill of chex mix.


Quote from: elmer on 06/23/2016, 03:33 PMIf you just want to write a "game", and don't really care what it runs on, then there are plenty of options out there ... like Unity.

Oh, and I, personally, have no interest in doing a PC game. Been there, done that, have the polo shirt.

It's interesting (to me, at least) to try to push the old hardware and see what can be done.
They're certainly not bad options either. But my point in bringing up a PC version is so that people who don't own the original hardware can play the game easily. I ended up having to write an emulator for a current project since the licensing (and codebase) for existing ones were so chaotic. Although that was fun.

The best new-game-for-an-old-thing I've seen that's actually shipped is still Fantasy Zone II DX. Aside from being a stellar product for the System-16, it was also available day one for the PS2 - in 2008. I think we can do better nearly a decade later.


Quote from: elmer on 06/23/2016, 03:33 PMThere are so many compiler-compiler tools out there these days that just reading in a program in C-style-assembler and then transforming it into standard assembler is a pretty simple project.

Then I could get to play with the CoCo/R parser-generator again. It's so much nicer than lex/yacc, and it already has a complete ANSI C grammar written for it. Just add the symbol table, shake it all up, put it into the oven for half-an-hour, and it's done. How tough could it be? (Famous last words!  :wink:)
I think you would be able to manage with just a good macro assembler and well-developed support libraries :).

TurboXray

The ARM for C code idea is actually pretty decent, if all you wanted to do was develop games in an HLL for the PCE. Genesis doesn't really need it, but it'd be decent for the SNES too (the whole 65x-and-C mismatch thing).

Assuming the cost is low, part of the problem is modifying an emulator to run an ARM core. It's possible (I have a build of mednafen that has a 68k core running, accessible via ports on the PCE hardware bank).

 I'm comfortable with assembly, so I'm good without it. But I can see it being attractive to some.  I also code for the PCE because of the challenge, and that games really didn't push its limits. That might not be the motivator for all, though.

TailChao

#27
Quote from: TurboXray on 06/23/2016, 06:00 PMAssuming the cost is low, part of the problem is modifying an emulator to run an ARM core. It's possible (I have a build of mednafen that has a 68k core running, accessible via ports on the PCE hardware bank).
I was wondering why that was in the Mednafen source for years. Good to know.


Also, I didn't mean emulate the whole ARM core on the PC. Ideally you'd define a simple messaging system between the HuC6280 and the ARM (or whatever coprocessor) on the cartridge. Then you can have all the coprocessor functionality compiled natively on the PC.

If the coprocessor is running all your game logic, you technically don't have to emulate the full PC-Engine hardware at all, just what it was supposed to do (get inputs, display this, play this noise).

For example, I have all the game logic running natively on the 6502 in the 7800. There is an ARM in the cartridge which runs a softsynth and responds to simple messages like PLAY, STOP, ATTEN, PAUSE, RESUME, and RESET. When running the game on Windows, the softsynth is native and the game is emulated. You could just as easily do the opposite.
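For flavor, the console side of such a messaging scheme can be tiny. A minimal 6502-style sketch (the MAILBOX address, the command values, and "song_id" are entirely hypothetical):

    CMD_PLAY = $01           ; hypothetical command codes
    CMD_STOP = $02
    MAILBOX  = $4000         ; hypothetical cartridge mailbox port

    play_song:
        lda     #CMD_PLAY    ; command byte first...
        sta     MAILBOX
        lda     song_id      ; ...then one argument byte
        sta     MAILBOX
        rts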

Edit :
Quote from: TurboXray on 06/23/2016, 06:00 PMI'm comfortable with assembly, so I'm good without it. But I can see it being attractive to some.  I also code for the PCE because of the challenge, and that games really didn't push its limits. That might not be the motivator for all, though.
I agree, and yes - there is always more that can be done with any hardware.

Unfortunately, it can't be done for free.

TurboXray

#28
Quote from: TailChao on 06/23/2016, 06:19 PM
Quote from: TurboXray on 06/23/2016, 06:00 PMAssuming the cost is low, part of the problem is modifying an emulator to run an ARM core. It's possible (I have a build of mednafen that has a 68k core running, accessible via ports on the PCE hardware bank).
I was wondering why that was in the Mednafen source for years. Good to know.
It actually had its own bitmap display that overlaid the PCE display. Ports on the hardware bank, which is always mapped, were used to communicate between it and the PCE processor. The idea was that initially you load a script file, and a code block, to the 68k via these ports. Then when the game runs and it receives a special command, it displays an overlaid subtitle script for a length of time. The idea was to hook all CDPLAY and ADCPLAY routines to write the sector arguments to the ports as well, which would hash into a lookup table to start the overlay script - instant subtitling for cinemas and ADPCM parts of Japanese games.

There was also 1K of RAM for the PCE hack/hook code to use, mapped in the hardware bank open-bus area as well.
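A hedged sketch of what a PCE-side hook like that could look like (the port address and all labels are invented for illustration; "sector_lo"/"sector_hi" stand in for wherever the play routine keeps its sector arguments):

    SUB_PORT = $18C0         ; hypothetical port in the hardware bank

    cd_play_hook:            ; patched in place of the game's CD-play call
        lda     sector_lo    ; mirror the sector arguments to the 68k,
        sta     SUB_PORT     ; which hashes them into its subtitle table
        lda     sector_hi
        sta     SUB_PORT
        jmp     cd_play_org  ; then continue into the original routine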

TurboXray

Quote from: TheOldMan on 06/23/2016, 02:28 PMFWIW, I ended with a CS degree and a history minor. That's what happens when you take all the elective stuff as regular courses, and fill the last semester with things like 'art appreciation' :)
That made me laugh. I haven't officially declared a minor, but it'll probably be psychology :P

spenoza

Quote from: elmer on 06/23/2016, 01:02 PMI really can't understand why Python seems to be taking over so much mind share in academia ... it's just like teaching BASIC.

And sorry, but any language where exact spacing is significant ... that's f**ked-up!  :roll:
My college roommate currently works for Mozilla in their education program. He's with them because they bought the startup he co-founded. He's very user-focused and loves Python particularly because, as computer languages go, it's very democratizing. I mean, he did do grad school under Jef Raskin shortly before the man's death. So HCI is really important for him. Many schools probably assume most of their students will be doing business or web programming, since that's what most programming jobs these days seem to want, and learning those higher-level languages is a great way to dive right in. I think schools that put a lot of focus on assembly and lower-level stuff are providing students for very particular markets, rather than pushing out grads who will work their way up to databases, Java, and PHP and probably not interact beyond that world.

spenoza

Quote from: TurboXray on 06/23/2016, 07:38 PM
Quote from: TheOldMan on 06/23/2016, 02:28 PMFWIW, I ended with a CS degree and a history minor. That's what happens when you take all the elective stuff as regular courses, and fill the last semester with things like 'art appreciation' :)
That made me laugh. I haven't officially declared a minor, but it'll probably be psychology :P
I wish my college had offered a psyc minor. I took enough psyc classes. But I took one of those fuzzy interdisciplinary programs and so my courses were all over the place anyway.

Arkhan Asylum

Python is cute for math simulations and piddly little things, but as a larger language, it's pretty stupid.

Mostly because of the "tabs signal intent" thing.

Imposing formatting on the programmer is dumb.

This "max-level forum psycho" (:lol:) destroyed TWO PC Engine groups in rage: one by Aaron Lambert on Facebook "Because Chris 'Shadowland' Runyon!," then the other by Aaron Nanto "Because Le NightWolve!" Him and PCE Aarons don't have a good track record together... Both times he blamed the Aarons in a "Look-what-you-made-us-do?!" manner, never himself nor his deranged, destructive, toxic turbo troll gang!

spenoza

Quote from: Psycho Arkhan on 06/23/2016, 10:47 PMPython is cute for math simulations and piddly little things, but as a larger language, it's pretty stupid.

Mostly because of the "tabs signal intent" thing.

Imposing formatting on the programmer is dumb.
Certainly doesn't sound like it's your cup of tea, but there are a lot of creative and talented people doing good stuff with it. Artisans get to pick their tools. That's kinda how it works.

Arkhan Asylum

Quote from: guest on 06/24/2016, 05:54 PM
Quote from: Psycho Arkhan on 06/23/2016, 10:47 PMPython is cute for math simulations and piddly little things, but as a larger language, it's pretty stupid.

Mostly because of the "tabs signal intent" thing.

Imposing formatting on the programmer is dumb.
Certainly doesn't sound like it's your cup of tea, but there are a lot of creative and talented people doing good stuff with it. Artisans get to pick their tools. That's kinda how it works.
There's lots of talented people making art out of garbage.  That's a thing.

;)

This "max-level forum psycho" (:lol:) destroyed TWO PC Engine groups in rage: one by Aaron Lambert on Facebook "Because Chris 'Shadowland' Runyon!," then the other by Aaron Nanto "Because Le NightWolve!" Him and PCE Aarons don't have a good track record together... Both times he blamed the Aarons in a "Look-what-you-made-us-do?!" manner, never himself nor his deranged, destructive, toxic turbo troll gang!

TurboXray

I'm not as critical of languages as most people, but I just didn't want to spend time with Python. Java has some use for me, so I'd rather have that in class so I spend less free time outside on my own getting to learn it. Otherwise I couldn't care less, personally.

The reason I was told for switching over to Python from Java, for the early part of the CS program, was that Java was too difficult for beginners. Apparently there's a high drop-out rate in the CS program at this university because it's reportedly "too difficult". I'm guessing this complaint comes from the pre-major program classes.

I don't think Java is difficult at all, and I've done a little bit of Python just out of curiosity, so I don't see that as a real issue. The problem I see is not syntax, but that the university doesn't offer any super-beginner courses for those that have never programmed before. This might sound odd, but there are students in my summer class that have never coded a day in their life. They're struggling in this class. I find that concept almost absurd... why would you do a CS program with no experience? It's like me never playing an instrument but applying for the college of music... type of mentality. I guess that's why there's a pre-major program in place to circumvent this and cater to these people (zero experience), but the initial classes don't adequately address this. I guess they think switching to Python is the solution.

The problem I have with Python, from what I've seen of what other programmers have stated, is that it's their first language and they're too afraid to move on to something else. So they isolate themselves from other languages, etc. From reading many other programmers' responses, those that started with Java first had little issue moving on to something like C++ or C#, etc. I could be wrong on this, but I think Python is popular for the wrong reasons. And schools switching focus to cater to it only facilitates this notion and culture. Schools should be focusing on concepts related to CS first and foremost, and a close second should be exposure to quite a few different languages. Force them to move outside their comfort zone.

spenoza

Quote from: TurboXray on 06/25/2016, 12:24 AMwhy would you do a CS program with no experience? It's like me never playing an instrument but applying for the college of music... type of mentality. I guess that's why there's a pre-major program in place to circumvent this and cater to these people (zero experience), but the initial classes don't adequately address this. I guess they think switching to Python is the solution.
With music, parents are willing to pay for private lessons, often. Not so with most other disciplines outside of sports. And since primary schools do such a poor job trying to deal with CS, colleges have to adapt. It's not a good situation for anyone.

I wish more colleges offered coding classes for non-majors. Something like Python or PHP with some JS and HTML could be useful in all sorts of different fields. But yeah, if you're going into a CS program, you should probably start with something a little more traditional, and if Python is called for add it later. Python is a great and easy to use language, but I don't know if I would consider it foundational the way I would something like Java.

OldMan

QuoteThe reason I was told for switching over to Python from Java, for the early part of the CS program, was that Java was too difficult for beginners. Apparently there's a high drop-out rate in the CS program at this university because it's reportedly "too difficult".
We need a spit-take emoji. I'd use it here.
Of the 60 or so people who -started- the CS classes when I did, over half went to the business major after the first year. And about 1/3 of those remaining changed their major.
I think there were ~15 of us who finished the whole curriculum. I do have to say, I was taught more than a bunch of languages - I was taught to program, no matter what the language.

QuoteThis might sound odd, but there are students in my summer class that have never coded a day in their life. They're struggling in this class.
Doesn't sound odd to me. I meet people like that all the time. They want the money programming pays. And after all, "How hard can it be?" <lol>
What the college really needs is to make them take something more than algebra to start the classes; something that makes you think logically, step-by-step. Math proofs the entire semester. And how to solve problems.

Arkhan Asylum

People at college lately do not learn how to program.  They learn how to write code.

and mostly learn how to like, do cookie cutter tasks, and that is it.

It should be a sink or swim setup.  Screw coddling people.  That's how you end up with tons of weak ass coworkers.
This "max-level forum psycho" (:lol:) destroyed TWO PC Engine groups in rage: one by Aaron Lambert on Facebook "Because Chris 'Shadowland' Runyon!," then the other by Aaron Nanto "Because Le NightWolve!" Him and PCE Aarons don't have a good track record together... Both times he blamed the Aarons in a "Look-what-you-made-us-do?!" manner, never himself nor his deranged, destructive, toxic turbo troll gang!

Gredler

Loving reading this thread, thanks for sharing the perspective and history stories.

I coded very rudimentary things as a kid in QBASIC and ZZT. I soon after learned HTML to make Final Fantasy and Xenogears fan pages. The HTML knowledge floated me through college doing late-'90s through mid-2000s style web pages, but my work kept leaning towards art more than programming. Programming was so much more math intensive, and difficult for me, that I ended up focusing on art so much that it became my career.

Fast forward 10 years and studios have access to hundreds of thousands of mind-blowingly talented sculptors and painters to choose from. My skill set of "weak but above average for an artist technical understanding paired with crude but sufficient artistic ability" is getting less competitive each day, and I need to plan the road ahead, so I started thinking furthering my technical abilities is probably the best idea.

When seeking advice on what to focus study on, my leads all say Python, Python, Python. If I don't want to become an amazing sculptor, or paint like a master, then I will need to learn Python and HLSL to handle HDR/PBR results.

Python seems to be a required standard for technical artists, but I think knowing a lower-level language trumps that requirement, as it exemplifies a greater understanding of the craft. Python is my current language of study, since they specifically asked me to learn it, but someday I'd like to try to learn asm or C, so I can homebrew solo :P

Regardless of writing code, programming, scripting, or creating shaders, I have to get better at math. I think it's impossible for me to get through these languages as an idiot with math (linear algebra is what I am told to become proficient in).

So I guess it's back to school for Gredler within the next year or so. Khan Academy in the meantime.

TurboXray

Quote from: Gredler on 06/25/2016, 10:51 PMRegardless of writing code, programming, scripting, or creating shaders, I have to get better at math. I think it's impossible for me to get through these languages as an idiot with math (linear algebra is what I am told to become proficient in).

So I guess it's back to school for Gredler within the next year or so. Khan Academy in the meantime.
I honestly don't know how strong the correlation is between math and just programming in general. Obviously some fields are more demanding in relation to higher-level math. I see a fracture in the CS field as it is now (BA vs BS). I guess that's understandable given the wide range of fields programmers go into.

spenoza

Quote from: TurboXray on 06/29/2016, 11:51 PM
Quote from: Gredler on 06/25/2016, 10:51 PMRegardless of writing code, programming, scripting, or creating shaders, I have to get better at math. I think it's impossible for me to get through these languages as an idiot with math (linear algebra is what I am told to become proficient in).

So I guess it's back to school for Gredler within the next year or so. Khan Academy in the meantime.
I honestly don't know how strong the correlation is between math and just programming in general. Obviously some fields are more demanding in relation to higher-level math. I see a fracture in the CS field as it is now (BA vs BS). I guess that's understandable given the wide range of fields programmers go into.
And in truth, while math is very important for specific kinds of coding and for logic, programming also has quite a lot in common with language, especially in how there's more than one way to do most tasks. So I think language acquisition and use skills are equally valuable.

TurboXray

spenoza: I wouldn't doubt it. I mean, we string together really abstract concepts on the fly when we speak. I can see the similarities with programming constructs. It's just that the syntax is more precise in computer programming languages vs human language. But it wouldn't surprise me if it used some of the same parts of the brain.

Arkhan Asylum

a strong background in discrete math helps with programming.

a strong background in algebra, and basic geo/trigonometry helps for game programming.

This "max-level forum psycho" (:lol:) destroyed TWO PC Engine groups in rage: one by Aaron Lambert on Facebook "Because Chris 'Shadowland' Runyon!," then the other by Aaron Nanto "Because Le NightWolve!" Him and PCE Aarons don't have a good track record together... Both times he blamed the Aarons in a "Look-what-you-made-us-do?!" manner, never himself nor his deranged, destructive, toxic turbo troll gang!