Perhaps you're right, they should not, and that's why I propose a new paradigm.
Nowadays assemblers are turning 60, and I think it's about time to invent a better bicycle.
eblc1388 said:
When one uses the name of a control bit to refer to it, and uses the proper "include" file during assembly, why should the assembler complain of any errors?
But if one changes to another processor, one would expect the registers will not be in the same memory space or carry the same bit meanings until it is confirmed otherwise. Therefore a complete match without user intervention/change is a gift and not something to be taken for granted.
Sorry, I still think sorting this out is the responsibility of the programmer rather than the assembler.
Or, if you wish, the problem can be tackled via some #define statements or macros in an include file of your own, in which you can associate the correct register with the control bit name.
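For instance, such a user include file might look like this (all register names and bit positions below are invented for illustration, not taken from any real device header):

    /* my_device.h -- a user-maintained mapping layer: the program says
       TX_ENABLE, and this file decides which register/bit that means on
       the device currently in use (names and bit numbers illustrative). */
    #include <stdint.h>

    /* Stand-in for a memory-mapped SFR; a real device header would place
       it at a fixed address. */
    volatile uint8_t TXSTA;

    #define TX_ENABLE_REG   TXSTA
    #define TX_ENABLE_BIT   5
    #define TX_ENABLE_SET() (TX_ENABLE_REG |= (uint8_t)(1u << TX_ENABLE_BIT))
    #define TX_ENABLE_CLR() (TX_ENABLE_REG &= (uint8_t)~(1u << TX_ENABLE_BIT))

    /* Porting to another part means editing only the two defines above. */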
Yes, sure. But I still think that the development tool (I am not saying the assembler) could help with this.
eblc1388 said:
Or, if you wish, the problem can be tackled via some #define statements or macros in an include file of your own, in which you can associate the correct register with the control bit name.
Starting from scratch, to achieve incremental improvement, is either folly or hubris. I can't decide which. You're going to need better justification to generate any enthusiasm.
The problem with working on tools is that there is no economic incentive. In the beginning we create tools because there are none. Once they achieve a certain level of functionality we move on to the problems with an economic benefit.
People on this forum complain about the cost of tools. They are expensive because they are hard to develop, debug, support, and maintain. There are freeware tools out there but they are generally poor cousins to the commercial offerings.
And I have failed to implement interfaces because SDCC, PICC, and mikroPascal for the PIC14 family do not generate proper code to work with function pointers.
Duh. You shouldn't use function pointers; it's nearly impossible to support them on PIC hardware.
PICs do not have a data stack and use an overlay methodology for reusing the same memory in different functions, by evaluating the tree of which functions can be called from which places and by prohibiting recursive function calls.
Once you turn a function into a pointer, there is no longer any reliable way to decipher the possible call tree (thus a lot of overlay elements have to be made static and the memory usage goes to hell) or to protect against recursive calls.
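A minimal sketch in plain C of why this breaks (the handlers and the dispatch scheme are invented for illustration and not tied to any particular PIC compiler):

    #include <stdint.h>

    /* Two handlers whose locals an overlay allocator would normally let
       share the same memory, based on the static call tree. */
    static uint8_t handler_a(uint8_t x) { uint8_t tmp = (uint8_t)(x + 1u); return tmp; }
    static uint8_t handler_b(uint8_t x) { uint8_t tmp = (uint8_t)(x * 2u); return tmp; }

    uint8_t dispatch(uint8_t cmd, uint8_t arg)
    {
        /* The target is only known at run time... */
        uint8_t (*handler)(uint8_t) = (cmd & 1u) ? handler_a : handler_b;
        /* ...so at this indirect call the compiler can no longer prove
           which function is entered; the static call tree ends here, and
           the callees' locals must be allocated statically (or the call
           rejected outright). */
        return handler(arg);
    }

    int main(void) { return dispatch(1u, 41u) == 42u ? 0 : 1; }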
The tool language (e.g. an evolution of macros) will not be MCU-specific and could be similar to Java.
The device language (instruction set, registers, peripherals) will be (and must be) device-specific.
The device language will be defined in terms of the tool language, and that is the core idea of the proposed language.
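The proposed language does not exist yet, so the following is only a rough sketch, in plain C, of the flavour of the idea: the device is described as ordinary data in the same language the program is written in, and tools (or generic code) work against that description. Every name and address below is invented:

    #include <stdint.h>
    #include <stdio.h>

    /* A toy 'device metamodel': registers and bits as plain data. */
    typedef struct { const char *name; uint16_t address; } reg_desc;
    typedef struct { const char *name; const reg_desc *reg; uint8_t bit; } bit_desc;

    /* One concrete 'device definition', written in the host language. */
    static const reg_desc TXSTA_REG      = { "TXSTA", 0x098 };
    static const bit_desc UART_TX_ENABLE = { "TXEN", &TXSTA_REG, 5 };

    int main(void)
    {
        /* A tool could walk such descriptions to emit device-specific
           code, check a program against the device, or help port it to
           another device described the same way. */
        printf("%s = bit %u of %s @ 0x%03X\n",
               UART_TX_ENABLE.name, (unsigned)UART_TX_ENABLE.bit,
               UART_TX_ENABLE.reg->name, (unsigned)UART_TX_ENABLE.reg->address);
        return 0;
    }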
Electronics is my hobby and sometimes I have no spare time for it.
Thus, when starting a project I cut it into rather small, isolated pieces which can be completed in a short time and tested independently of the others.
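For illustration, the usual C way to express such an isolated piece is an interface: a struct of function pointers (the very construct the compilers above mishandle), so each piece can be tested on its own against a mock. All names below are invented:

    #include <stdint.h>

    /* The interface: the module knows these operations, not who
       implements them, so it builds and tests independently of hardware. */
    typedef struct {
        void    (*put_byte)(uint8_t b);
        uint8_t (*get_byte)(void);
    } byte_io;

    /* The isolated piece under test depends only on the interface. */
    static void echo_once(const byte_io *io)
    {
        io->put_byte(io->get_byte());
    }

    /* A mock implementation for host-side testing. */
    static uint8_t last_sent;
    static void    mock_put(uint8_t b) { last_sent = b; }
    static uint8_t mock_get(void)      { return 0x42; }

    int main(void)
    {
        const byte_io mock = { mock_put, mock_get };
        echo_once(&mock);
        return (last_sent == 0x42) ? 0 : 1;
    }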
Papabravo said:
You're going to need better justification to generate any enthusiasm.
The problem with working on tools is that there is no economic incentive. In the beginning we create tools because there are none. Once they achieve a certain level of functionality we move on to the problems with an economic benefit.
People on this forum complain about the cost of tools. They are expensive because they are hard to develop, debug, support, and maintain. There are freeware tools out there but they are generally poor cousins to the commercial offerings.
But then you're writing in a completely different language, 100% processor-specific - you're also working within the limitations of the hardware, a totally different situation from using an HLL.
Papabravo said:
Starting from scratch, to achieve incremental improvement, is either folly or hubris. I can't decide which. You're going to need better justification to generate any enthusiasm.
Not yet. I have made a few experiments to prove to myself that the mission is possible, and that's it so far.
The first four steps in my roadmap are:
1. Refine vision
2. Establish requirements
3. Design a device metamodel
4. Make key design decisions on the language and define a protolanguage
And what I am really missing is deep knowledge of other MCUs.
These are common starting points for every compiler design since the first FORTRAN and COBOL compilers from the late fifties and early sixties. Nothing new here and I'm with Nigel on the dubious nature of the claims. I hope you can prove me wrong but there is an awful lot of experience and knowledge wrapped up in our current tools. These are not stupid or incompetent people, they are giants in this field. The nature of the task is an epic undertaking.
My belief is that defining the MCU model in the same language the program is written in will open up possibilities hardly achievable with current tools.
No disrespect, but that looks like 'marketing speak', and has no meaning in the real world.
As I see it, what you're wanting to do is write a language that's fully functional on any processor it's applied to - so you can take a program from any processor and transfer it to any other, with no changes or limitations.
It's a nice idea, and it's perfectly possible, BUT it means crippling the language to the limitations of ALL the processors - this would make it a completely useless language, and rather than adding the parts you think current languages lack, it would do the complete opposite and simply remove them altogether.
In any case, you don't appear to have even started such a compiler, nor even given any real thought to it - and, to be honest, as you seem to want a language that completely 'spoon-feeds' you, I'm doubtful that you could write a compiler anyway.
Nigel Goodwin said:
No disrespect, but that looks like 'marketing speak', and has no meaning in the real world.
As I see it, what you're wanting to do is write a language that's fully functional on any processor it's applied to - so you can take a program from any processor and transfer it to any other, with no changes or limitations.
Not quite correct. If your program is written for a specific MCU and uses MCU features not available on other MCUs, you would have to make changes.
However, many parts of an embedded application (such as state machines, math, conversions) can be written for a 'more generic MCU', and those parts can be transferred to any other MCU that fits under the same 'generality umbrella'.
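For example, a state machine can stay entirely device-neutral; a minimal sketch in portable C (the states and events are invented to illustrate, not taken from a real project):

    #include <assert.h>

    /* Nothing here touches a register, so this compiles unchanged for
       any MCU whose compiler accepts portable C. */
    typedef enum { ST_IDLE, ST_RUN } state_t;
    typedef enum { EV_START, EV_STOP } event_t;

    state_t step(state_t s, event_t e)
    {
        switch (s) {
        case ST_IDLE: return (e == EV_START) ? ST_RUN  : ST_IDLE;
        case ST_RUN:  return (e == EV_STOP)  ? ST_IDLE : ST_RUN;
        default:      return s;
        }
    }

    int main(void)
    {
        state_t s = ST_IDLE;
        s = step(s, EV_START);   /* -> ST_RUN  */
        s = step(s, EV_STOP);    /* -> ST_IDLE */
        assert(s == ST_IDLE);
        return 0;
    }

Only the thin layer that maps EV_START/EV_STOP to real pins or interrupts would be rewritten per device.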
Nigel Goodwin said:
It's a nice idea, and it's perfectly possible, BUT it means crippling the language to the limitations of ALL the processors - this would make it a completely useless language, and rather than adding the parts you think current languages lack, it would do the complete opposite and simply remove them altogether.
My idea is to give developers the freedom to make that choice: either write within the 'limitations of ALL the processors' and have highly portable code, or write for a given MCU and have efficient code.
The language itself will not carry any information about the MCU; instead it will give you the tools to define an MCU (some MCU or any MCU, depending on how good the metamodel turns out to be).
Nigel Goodwin said:
In any case, you don't appear to have even started such a compiler, nor even given any real thought to it - and, to be honest, as you seem to want a language that completely 'spoon-feeds' you, I'm doubtful that you could write a compiler anyway.
IMHO, writing a compiler is not that great a challenge compared to designing the language so that it solves the tasks and remains clear and easy to understand and use. And this is where I need feedback from other developers. This kind of discussion straightens out my thinking and motivates me to find better explanations of my idea.
Once the language is designed and its BNF established, writing a compiler will be a purely technical task. Years ago I did it for some small sort-of-a-language data exchange formats.
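For instance, a first BNF fragment for the device-definition part of such a protolanguage might look like this (purely illustrative - no actual grammar has been defined yet):

    <device-def>   ::= "device" <identifier> "{" { <register-def> } "}"
    <register-def> ::= "register" <identifier> "@" <number> [ "{" { <bit-def> } "}" ]
    <bit-def>      ::= "bit" <identifier> ":" <number>
    <identifier>   ::= <letter> { <letter> | <digit> }
    <number>       ::= <digit> { <digit> }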