Hi,
I have a very large code image (more than 500MB). Is there any way to expand the MSP430 flash memory? Thanks.
The MSP430 has no externally available memory bus (and only a 16/20-bit address bus), so no, there is no way to extend the MSP430's memory.
However, the MSP430 can communicate with a large number of external devices that offer storage space, for example SD cards. Accessing that data requires a peripheral data transfer, though; such memory is not directly addressable like RAM or main flash.
I have successfully used a 1GB SD card for logging data with almost 2MB/s throughput (for reads; writes are slower).
No, you cannot execute code in external memory. At best, code already present in internal memory can copy a routine from external memory into internal memory (RAM) and then execute the copy.
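As a rough illustration of that copy-then-execute idea, here is a minimal sketch (not from this thread). It assumes the routine stored on the external device was built position-independent and linked for execution from RAM, and readExternal() is a hypothetical placeholder for whatever SPI/I2C transfer fetches the bytes:

#include <stdint.h>

// Hypothetical transfer routine: fetch 'len' bytes starting at 'srcAddr' on the external device.
extern void readExternal(uint32_t srcAddr, uint8_t *dst, uint16_t len);

#define BLOB_WORDS 256
static uint16_t ramCode[BLOB_WORDS];     // word-aligned RAM buffer; the MSP430 can execute from RAM

typedef void (*entry_fn)(void);

void runExternalRoutine(uint32_t srcAddr)
{
    readExternal(srcAddr, (uint8_t *)ramCode, sizeof ramCode);  // copy the routine into internal RAM
    entry_fn entry = (entry_fn)(uintptr_t)ramCode;              // jump to its first instruction
    entry();
}

In practice this only works if the blob was built for it, so it is rarely worth the trouble compared with keeping the code internal and only the data external.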
Look at the summary page for TI's microcontroller products: http://www.ti.com/lsds/ti/microcontroller/home.page
The top of the MSP430 range has 256KB - you are over three orders of magnitude outside this range!!
Even the top of the entire microcontroller range has only 1MB.
Whatever it is you're considering is obviously waaaaaaaayyy outside the scope of a microcontroller!!
As already noted, for that kind of code size, you need to be looking at a high-end microprocessor:
Perhaps ARM Cortex-A: http://www.ti.com/lsds/ti/arm/overview.page
Or, maybe, OMAP: http://www.ti.com/general/docs/gencontent.tsp?contentId=46946
What, exactly, is it that you're trying to do that requires so much code?
It's difficult to make appropriate suggestions without knowing your goal!
I am sampling an analog signal, and for each sample the code sets the GPIOs high or low according to the sample value. With an MSP430 with 4KB of flash I was able to represent a maximum of 150-160 samples. The code was as follows:
P1OUT = 0x0B;           // sample 1
P2OUT = 0x10;           // sample 1
__delay_cycles(310);    // wait one sample period
P1OUT = 0x10;           // sample 2
P2OUT = 0x14;           // sample 2
__delay_cycles(310);    // wait one sample period
// ... and so on, up to about 150 samples
Now I want to sample at 50kHz for ten minutes, and if you do the math, the flash needed for code generated this way could reach 800MB.
My options are to expand the flash, or to interface an SD card where I save the samples and have the code read each sample and then set the GPIOs. My concern is that reading from the SD card is slower than reading from flash.
Thanks
The code you provided is not sampling an analog signal; it is periodically setting output port values in about the most inefficient way one can imagine. If I were you, I would store the samples (for God's sake, as data, not code!) on an SD card as a data array, read them from the SD card using the SPI peripheral, and use a timer peripheral for the sample timing. The code will be very compact; your 4KB MSP430 will do it easily. To store ten minutes of 16-bit samples, 2 * 50000 * 60 * 10 = 60 megabytes are needed.
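A minimal sketch of that timer-plus-SPI playback structure (not part of the original thread): it assumes a 16MHz SMCLK, CCS/IAR-style interrupt syntax, Timer0_A register names (TA0...), and readSdBlock() as a hypothetical placeholder for your SPI/SD block-read routine; buffer sizes would have to be adapted to the RAM actually available:

#include <msp430.h>
#include <stdint.h>

#define SAMPLE_DIV   (16000000UL / 50000UL)   // 16 MHz SMCLK / 50 kHz = 320 timer ticks
#define BUF_SAMPLES  32

// Each stored sample holds the two port values to output.
typedef struct { uint8_t p1; uint8_t p2; } sample_t;

// Hypothetical placeholder: fill 'dst' with 'n' samples read from the SD card over SPI.
extern void readSdBlock(sample_t *dst, uint16_t n);

static sample_t buf[2][BUF_SAMPLES];          // double buffer: play one half, refill the other
static volatile uint8_t  playBuf = 0;
static volatile uint16_t idx = 0;
static volatile uint8_t  needRefill = 0;

int main(void)
{
    WDTCTL = WDTPW | WDTHOLD;                 // stop the watchdog
    // (clock setup for a 16 MHz SMCLK omitted - device specific)

    P1DIR = 0xFF;                             // P1/P2 as outputs
    P2DIR = 0xFF;

    readSdBlock(buf[0], BUF_SAMPLES);         // pre-fill both buffers before starting
    readSdBlock(buf[1], BUF_SAMPLES);

    TA0CCR0  = SAMPLE_DIV - 1;                // compare value for a 50 kHz tick
    TA0CCTL0 = CCIE;                          // interrupt on compare 0
    TA0CTL   = TASSEL_2 | MC_1 | TACLR;       // SMCLK, up mode

    __enable_interrupt();

    for (;;) {
        if (needRefill) {                     // refill the buffer that just finished playing
            needRefill = 0;
            readSdBlock(buf[playBuf ^ 1], BUF_SAMPLES);
        }
    }
}

#pragma vector = TIMER0_A0_VECTOR
__interrupt void timerIsr(void)               // runs at 50 kHz: output the next sample
{
    P1OUT = buf[playBuf][idx].p1;
    P2OUT = buf[playBuf][idx].p2;
    if (++idx >= BUF_SAMPLES) {
        idx = 0;
        playBuf ^= 1;                         // switch buffers...
        needRefill = 1;                       // ...and ask the main loop to refill the old one
    }
}

The timer interrupt guarantees the 50kHz pacing, while the main loop only has to keep the inactive half of the double buffer topped up from the SD card.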
I am curious - what's the application? Why do you need such a high "sample" rate? I suppose you have a 16-bit DAC attached to P1 and P2?
So you're not talking about code at all - i.e., instructions for the CPU to execute - you are talking about data!
Yes, you can certainly add external data storage; e.g., using an SPI or I2C connection.
You could connect an SD card via SPI, or there are plenty of large data-storage memory chips with SPI/I2C interfaces...
The sampling is done by a high-speed DAQ, and the code above is generated automatically by a C program. The MSP430 just has to do something for each sample. The application is related to sampling the battery current of mobile devices; sometimes a 250kHz high-speed DAQ is used. Anyway, I now know what I should do - thanks for the support!
Islam Saleh1 said: a 250KHz high speed daq is used
Wow, at a 250kHz sampling rate you have 4 CPU clock cycles per MHz of CPU speed per sample. At 25MHz CPU speed (the fastest MSP430), that is 100 CPU clock cycles for transferring the sample data and for 'doing something' - including storing the data in external memory. Not much headroom.
Andy Neil said: "I have a large code (more than 500MB)" - where did that code come from?
Now we know that it is data, not code. But well, if you wonder how large masses of code can be produced, I can tell you a story.
About 12 years ago, when I was working for NewDeal on the GEOS operating system, there was a deal with Canon to bundle one of their printers with a PC and GEOS. Canon was asked to write a printer driver for the printer. Their first printer driver was over 6MB in size. At the time, the whole GEOS operating system, including all applications, drivers (including drivers for many other printers), demo documents, fonts etc., was <20MB. A typical printer driver was below 10k.
The folder with the sources of this memory hog was named '666'.
At the second attempt they produced a 600kB driver. Still larger than the word processor, spreadsheet, and vector paint application combined. And much larger than any other printer driver.
That's what happens when people try to port windows bloatware to a resource-limited system.
Jens-Michael Gross said: That's what happens when people try to port windows bloatware to a resource-limited system.
That's unfortunately true. Otherwise we could run today's software on last decade's PCs without needing to buy a new PC every year. Sometimes, when I see what tremendous 'minimum requirements' are listed for rather simple tasks, I can barely keep myself from vomiting.
Ilmars said: If you try to create software for feature- and resource-rich systems using the same approach as for resource-limited systems, you will not deliver (in time / before the competition).
However, many people who are used to this method of "saving development time at the customer's expense" believe this tradeoff is still possible even when there is a hard limit on the resources.
If you find a copy of 'kkrieger.exe', try it out. Those guys put a whole 3D first-person shooter, including sound, into a tiny 96k (kilobytes!) exe file. No other data or code (except for DirectX). Including all textures and the whole game engine. An extreme example, but it shows what is possible.
Jens-Michael Gross said: However, many people who are used to this method of "saving development time at the customer's expense" believe this tradeoff is still possible even when there is a hard limit on the resources.
Well... the world of hard limits is the mass-production world. If someone tries to save on development there - fire him.
Ilmars said: Well... the world of hard limits is the mass-production world. If someone tries to save on development there - fire him.
This would definitely lead to better products - and to significantly more unemployed people.
Unfortunately, those people are often successful in telling their bosses that 'things cannot be done this way', and in the end the product contains a PC with bloatware on it. Expensive, slow, power-eating, but a relief for the unskilled coder.
Two weeks ago I was in a local supermarket and the scale for the vegetables wasn't working. Someone came and reset it, and it took a whopping 6 minutes to boot Windows and start the UI. I'm sure it was a very expensive device.
Jens-Michael Gross said: Expensive, slow, power-eating, but a relief for the unskilled coder.
Suppose your task were to engineer a networked "high-end" supermarket scale with a connection to an SQL database, with a local printer but also a remote-label-printer option (from multiple leading manufacturers), and with an easily customizable touch panel able to display product images. Expected sales: a couple of thousand units per year. Time to market: let's say one year. Would you develop such a product the same way as wireless shoe sensors?
Ilmars said: Would you develop such a product the same way as wireless shoe sensors?
As a customer, I would expect it to work just as seamlessly. Why does a $.25 shoe sensor work without compromise while a $2500 scale does not?
The problem is how carelessly things are developed. It's not necessary to squeeze the hardware to the max - just waste a bit less of the resources.
I remember a PC SF game where there was an alien race which regularly (= every few million years) harvested the life energy of a whole galaxy (there are enough of them) as 'food'.
The game ended with a few small adjustments to the energy-collecting device, so that the same amount of energy could be harvested without killing or even hurting a single being. The alien race just hadn't cared about improvements because there were so many resources (= galaxies) available. Whoever wrote the game concept - I bet he also noticed the trend in software development :)