
C Interview Questions and Answers



Q: What does the error message "DGROUP data allocation exceeds 64K" mean, and what can I do about it? I thought that using the large model meant that I could use more than 64K of data!

A: Even in large memory models, MS-DOS compilers apparently toss certain data (strings, some initialized global or static variables) into a default data segment, and it's this segment that is overflowing. Either use less global data, or, if you're already limiting yourself to reasonable amounts (and if the problem is due to something like the number of strings), you may be able to coax the compiler into not using the default data segment for so much. Some compilers place only "small" data objects in the default data segment, and give you a way (e.g. the /Gt option under Microsoft compilers) to configure the threshold for "small."
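One common workaround is to declare large objects with the non-standard far keyword so that they are placed in their own segment instead of DGROUP. Here is a minimal sketch, assuming a 16-bit MS-DOS compiler such as Microsoft C or Borland C in large model; the exact spelling of the keyword (far, _far, or __far) varies by compiler, and the sizes and names below are made up for illustration:

    /* Sketch only: assumes a 16-bit DOS compiler with the non-standard
     * "far" extension, built in large model.
     */
    static char far big_table[40000];    /* explicitly far: gets its own segment,
                                            not the default DGROUP segment       */
    static char message[] = "hello";     /* small item: may still land in DGROUP */

    int main(void)
    {
        big_table[0] = 'x';
        return 0;
    }

Alternatively, with Microsoft's 16-bit compilers, a command line along the lines of cl /AL /Gt16 prog.c asks the compiler to move any data item larger than 16 bytes out of the default data segment. The precise option syntax is compiler- and version-specific, so check your compiler's documentation.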
