Author Topic: [SOLVED] No attempt to write core file  (Read 7222 times)

Offline onesteplower

  • Multiple posting newcomer
  • Posts: 12
[SOLVED] No attempt to write core file
« on: October 10, 2012, 08:00:26 pm »
Came back to edit one more time to say thank you to oBFusCATed and jens. Code::Blocks is awesome, and so are you!

Solution:

Running sudo cat /proc/sys/kernel/core_pattern returned:

|/usr/share/apport/apport %p %s %c

apport was not installed.  Reinstalling apport fixed this problem, but raises another curious issue: Ubuntu attempts to send my source to Canonical every time a core is dumped, not to mention an annoying popup that stays up until it is sent.  So it seems I need to play with the core_pattern a bit.  It's not that I don't want to share source, but that I'd rather share it when it's ready.
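For future readers, a quick way to see where core dumps are being routed is to inspect that same file; a minimal sketch (only the /proc path is standard, the script around it is illustrative):

```shell
#!/bin/sh
# Report whether core dumps are written to a file or piped to a handler
# such as apport. Only the /proc path below is standard; the rest of
# this script is an illustrative sketch.
PATTERN_FILE=/proc/sys/kernel/core_pattern
if [ -r "$PATTERN_FILE" ]; then
    pattern=$(cat "$PATTERN_FILE")
    case "$pattern" in
        \|*) echo "core_pattern pipes to a handler: $pattern" ;;
        *)   echo "core_pattern writes a file: $pattern" ;;
    esac
else
    echo "core_pattern is not readable on this system"
fi
# To restore the kernel default (a file named "core" in the working
# directory), run as root:  echo core > /proc/sys/kernel/core_pattern
```

On Ubuntu the change can be made persistent via /etc/sysctl.conf (kernel.core_pattern=core), which also stops apport from intercepting dumps.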

The post at the bottom of this page includes a program whose output shows whether a file size or core limit is causing the problem, and it attempts to correct the limits and produce a core dump if so.  For Code::Blocks specifically, ensure that the project's execution working directory is somewhere the user has read and write permissions (Project -> Properties -> Build Target -> Execution Working Directory).  Scattered throughout this thread should be enough tips to cure whatever ails future readers (hopefully).

The original post is below:

--------------------------------------------------------------------

Code::Blocks suddenly stopped writing core files.

* It is not a ulimit issue -- I know to use ulimit -c unlimited  
* It is not a permissions issue -- This is the case even when I run a program as root  
* It is not a code issue -- if I #include signal.h and call abort(), there is still no core file  
* The "Core dumped" message never appears  

I've fixed a segfault using valgrind without the core file, so it's not crippling -- just inconvenient.  So far, I've tried

* Removing and reinstalling gcc  
* Removing and reinstalling Code::Blocks  
* Changing to the nightly builds  

If there's a change that precipitated this, then I don't know what it was.  Perhaps an update to Ubuntu?  Searching Google leads me to pages where people seem to express the same issue, but then they just get told to use ulimit -c unlimited.  As mentioned above, that's not the issue here.

As an aside, since switching to the nightly build, the preference to fold all blocks on startup does nothing and delimiters don't auto-complete (typing a '(', for example, gives me a '(' instead of "()").  Finally, I read that the nightly fixes the problem of debugging a console application failing to find its target console, but it didn't.  

Ubuntu 12.04

Code::Blocks
Build: Oct 7 2012, 22:35:11 - wx2.8.12 (Linux, unicode) - 32 bit
Version: svn build rev 8438
SDK Version: 1.13.9

gcc version 4.6.3

g++ -pedantic -Wall -g -fexceptions -O0 -std=c++0x -Winit-self -Wredundant-decls -Wcast-align -Wfloat-equal

This occurs with and without each of these compiler options, so ... I already ruled those out, but they're included for good measure.  These are the options that were being used when the problem began.  gcc 4.7 is installed, but since I can't figure out how to use it, Code::Blocks uses 4.6.3 (unless the switch happened without me doing anything? gcc -v still shows me 4.6.3).
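On the gcc 4.6/4.7 question: a sketch of how the default is usually inspected and switched on Ubuntu (the update-alternatives lines are commented out because they need root, and the priority numbers are illustrative):

```shell
# Check which gcc the plain "gcc" command currently resolves to.
command -v gcc && gcc --version | head -n 1 || echo "gcc not on PATH"
# On Ubuntu the default gcc can be switched with update-alternatives
# (requires root; priority numbers are illustrative):
#   sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-4.7 40
#   sudo update-alternatives --config gcc
# Alternatively, point Code::Blocks at a specific compiler under
# Settings -> Compiler -> Toolchain executables.
```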
« Last Edit: October 10, 2012, 10:01:37 pm by onesteplower »

Offline oBFusCATed

  • Developer
  • Lives here!
  • Posts: 13413
Re: No attempt to write core file
« Reply #1 on: October 10, 2012, 08:18:11 pm »
Core file writing has nothing to do with Code::Blocks; it is an OS/kernel feature, so you'll have to set up your kernel/OS to generate core files.
Have you tried to use "int *a=NULL; *a=5;", to see if a core file will be generated?
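For what it's worth, that test can also be run entirely outside the IDE, which takes Code::Blocks out of the equation; a sketch (the /tmp paths are illustrative, and gcc is assumed to be on PATH):

```shell
# Build a one-line crasher and run it outside any IDE, so only the
# kernel's core-dump machinery is involved (paths are illustrative).
cat > /tmp/crash.c <<'EOF'
int main(void) { int *a = 0; *a = 5; return 0; }
EOF
gcc -g -o /tmp/crash /tmp/crash.c || echo "compile failed"
ulimit -c unlimited 2>/dev/null || true
( cd /tmp && ./crash ) || true          # segfaults; the shell carries on
ls -l /tmp/core* 2>/dev/null || echo "no core file written"
```

If no core file appears even here, the IDE is ruled out and the kernel's core_pattern or limits are the place to look.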

Finally, I read that the nightly fixes the problem of debugging a console application failing to find its target console, but it didn't.  
Details, steps to reproduce?

(most of the time I ignore long posts)
[strangers don't send me private messages, I'll ignore them; post a topic in the forum, but first read the rules!]

Offline onesteplower

  • Multiple posting newcomer
  • Posts: 12
Re: No attempt to write core file
« Reply #2 on: October 10, 2012, 08:31:44 pm »
Using int *a=NULL; *a=5; I get "Segmentation fault," but no "core dumped".

Regarding the other problem, "Warning: GDB failed to set the controlling terminal.  Operation not permitted" in one step: start the debugger.  I've read that this is a known bug, but also that it is fixed in the current nightly build.  I only mentioned it because it is not fixed in the nightly from 7 October for Ubuntu.  I can't read assembly, so that doesn't affect me much.  I just noticed it when I got bored and figured I'd try to learn to use it, and I mention it just to be helpful.

Regarding the core file, I'm still learning.  I actually thought that would be the case, since core dumping is a feature common to all IDEs, but there is nothing out there about this other than advice to use ulimit -c unlimited.  I've scoured probably fifty or so threads on it, so I'm sorry I can't cite them all, but I've seen people try to explain that this wasn't their problem, only to have it repeated to them as if it were the only possible cause.  So, I guess I'm on my own with that.

I wonder if it could be related to Unity.  A couple of days after this problem started, I did have to reset Unity because it started randomly logging me out...  That just ended up being related to the nVidia drivers, so it would be surprising if the two were related.  Shouldn't a purge of gcc and re-installation have fixed that anyway?
« Last Edit: October 10, 2012, 08:33:46 pm by onesteplower »

Offline Jenna

  • Administrator
  • Lives here!
  • Posts: 7255
Re: No attempt to write core file
« Reply #3 on: October 10, 2012, 08:46:35 pm »
Regarding the other problem, "Warning: GDB failed to set the controlling terminal.  Operation not permitted" in one step: start the debugger.  I've read that this is a known bug, but also that it is fixed in the current nightly build.  I only mentioned it because it is not fixed in the nightly from 7 October for Ubuntu.  I can't read assembly, so that doesn't affect me much.  I just noticed it when I got bored and figured I'd try to learn to use it, and I mention it just to be helpful.

This is not a C::B bug but a gdb bug, and it's just a warning that can be ignored.

The bug you mentioned was that terminals without an uppercase "T" in their command line (which xterm has, but e.g. gnome-terminal does not) were not found by the debugger, so the debugging output could not be parsed.
This bug is fixed in trunk.

Offline Jenna

  • Administrator
  • Lives here!
  • Posts: 7255
Re: No attempt to write core file
« Reply #4 on: October 10, 2012, 09:09:10 pm »
Regarding the core file, I'm still learning.  I actually thought that would be the case, since it's a feature common to all IDEs, but there is nada out there about this other than to use ulimit -c unlimited.  I've scoured probably close to fifty or so threads on it, so I'm sorry that I couldn't cite them all but I've seen where people tried to explain that wasn't the problem only to have it repeated to them as if it's the only problem that could happen.  So, I guess I'm on my own with that.

I wonder if it could be related to Unity.  A couple days after this problem started, I did have to reset Unity because it started randomly logging me out...  That just ended up to be related to nVidia drivers, so it would be surprising if the two are related.  Shouldn't a purge of gcc and re-installation fix that?
The core dump has nothing to do with the IDE.
Don't forget C::B is "just" an IDE, not the compiler nor the system.

"ulimit -c unlimited" does not mean you have unlimited resources. The limit is the current hard limit, which can only be changed by root.
 

By the way, the executable needs to be readable, and you should run cat /proc/sys/kernel/core_pattern to see whether the default template for creating a core dump has changed.

Offline onesteplower

  • Multiple posting newcomer
  • Posts: 12
Re: No attempt to write core file
« Reply #5 on: October 10, 2012, 09:27:52 pm »

This is not a C::B bug but a gdb bug, and it's just a warning that can be ignored.

The bug you mentioned was that terminals without an uppercase "T" in their command line (which xterm has, but e.g. gnome-terminal does not) were not found by the debugger, so the debugging output could not be parsed.
This bug is fixed in trunk.

Awesome!  So I can start learning to use it!

The core dump has nothing to do with the IDE.
Don't forget C::B is "just" an IDE, not the compiler nor the system.

"ulimit -c unlimited" does not mean you have unlimited resources. The limit is the current hard limit, which can only be changed by root.
 

By the way, the executable needs to be readable, and you should run cat /proc/sys/kernel/core_pattern to see whether the default template for creating a core dump has changed.


I'll update here for others who have this issue (even though it's not a Code::Blocks issue).  I figure others may come here with a similar problem, so I'll edit progress into this post until it's solved, and then edit the solution into the top post.  I'm going to check core_pattern now.

This may be a compound issue, based on the following program and its output:

#include "sys/resource.h"
#include "sys/time.h"
#include <iostream>

int main ( )
{
    struct rlimit limits;

    getrlimit (RLIMIT_CORE, &limits);        // Get core file limits
    std::cout << limits.rlim_cur << "\n";  // output current core file limit

    limits.rlim_cur = limits.rlim_max;      // Set current core file limit to hard limit in struct
    setrlimit (RLIMIT_CORE, &limits );      // update current core file limit

    getrlimit (RLIMIT_CORE, &limits);       // Get core file limits (to be sure the set worked)
    std::cout << limits.rlim_cur << "\n"; // Output current core file limit

    getrlimit (RLIMIT_FSIZE, &limits);       // Get file size limits
    std::cout << limits.rlim_cur << "\n"; // output current file size limit

    limits.rlim_cur = limits.rlim_max;      // Set current file size limit to hard limit in struct
    setrlimit (RLIMIT_FSIZE, &limits );      // update current file size limits

    getrlimit (RLIMIT_FSIZE, &limits);       // Get file size limits (to be sure the set worked)
    std::cout << limits.rlim_cur << "\n"; // output current file size limit

    int *a=NULL; *a=5;                             // Cause a segfault

    return 1;
}

/*

0
4294967295
4294967295
4294967295
Segmentation fault

*/

So, the soft limit on the core file starts out at 0, but the hard limit doesn't.  Raising the soft limit does not resolve the problem, though, so I suspect something in the kernel configuration itself is causing this.

Going by this: http://manpages.ubuntu.com/manpages/lucid/man5/core.5.html

* The directory is writeable (double checked working directory)
* The program has read and write permissions
* I own the program and directory (So shouldn't have had to use root before the problem, but I did! A hint?)
* The core file doesn't already exist
* The file system is not full
* It's not a ulimit issue
* This post ruled out RLIMIT_CORE and RLIMIT_FSIZE
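Several of the items on that checklist can be spot-checked from a shell in one go; a sketch (substitute the project's execution working directory for "."):

```shell
# Spot-check checklist items for a directory (sketch; "." is illustrative,
# substitute the project's execution working directory).
dir="."
[ -w "$dir" ] && echo "directory is writable" || echo "directory is NOT writable"
df -h "$dir" | tail -n 1                       # is the file system full?
ls -l "$dir/core" 2>/dev/null || echo "no pre-existing core file in the way"
```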

So, I've just ruled out all of Canonical's troubleshooting possibilities.  I'd like to report this as a bug, but I really don't know what to include to reproduce this problem because I don't know what precipitated it!  Just a sudden change, out of the blue.  
« Last Edit: October 10, 2012, 09:29:35 pm by onesteplower »