segmentation-fault

Segmentation fault with ulimit set correctly

ぃ、小莉子 submitted on 2019-12-12 10:04:40
Question: I tried to help an OP on this question. I found out that code like the one below causes a segmentation fault randomly, even if the stack is set to 2000 Kbytes. int main () { int a[510000]; a[509999] = 1; printf("%d", a[509999]); return 0; } As you can see, the array is 510000 x 4 bytes = 2040000 bytes. The stack is set to 2000 Kbytes (2048000 bytes) using the ulimit command: ulimit -s 2000 ulimit -Ss 2000 Based on those numbers the application has room to store the array, but randomly it return

PHP doesn't handle stack overflow?

若如初见. submitted on 2019-12-12 09:43:55
Question: I was surprised when I just tried the following PHP code: function foo() { foo(); } foo(); I expected to get "500: Internal server error". Instead the connection was closed immediately (no bytes received), and the log files show that Apache segfaulted. WTF? Is this a known bug in PHP? Are there some configuration options that I'm missing? Because a crashed process for every accidental stack overflow is, well... pretty unacceptable, I think. Answer 1: PHP is not able to deal with this, it will just

printf of char* gets Segmentation Fault

耗尽温柔 submitted on 2019-12-12 06:26:17
Question: I'm trying to read from a socket and print to stdout using printf (a must); however, I get a segmentation fault every time I read a specific file (an HTML) from the same web site. Please take a look at this code and tell me what's wrong. int total_read = 0; char* read_buff = malloc(BUF_SIZE); char* response_data = NULL; if (read_buff == NULL){ perror("malloc"); exit(1); } while((nbytes = read(fd, read_buff, BUF_SIZE)) > 0){ int former_total = total_read; total_read += nbytes; response_data =

all exec related PHP functions segfault

喜欢而已 submitted on 2019-12-12 06:02:41
Question: I built PHP for Android using the NDK. Most functions tested so far run perfectly, except... all exec-related PHP functions (like exec, shell_exec, popen, ...) segfault. PHP sample code (test.php): <?php $s=shell_exec("ls"); echo $s ?> result: # php test.php [1] Segmentation fault I added some debug code to the internal shell_exec function: PHP_FUNCTION(shell_exec) { FILE *in; size_t total_readbytes; zval **cmd; char *ret; php_stream *stream; if (ZEND_NUM_ARGS()!=1 || zend_get_parameters_ex(1

PHP in Commandline - Segmentation fault (core dumped) - Debug while running phpindexer.php in Magento

早过忘川 submitted on 2019-12-12 05:50:10
Question: I'm running a script on the command line. It runs for about 5 minutes and then returns Segmentation fault (core dumped). The script is a Magento reindexing script, found in /shell for those familiar with the platform. The command being run is php indexer.php --reindex catalog_url. It just throws Segmentation fault (core dumped) - I don't know where to look for any more info than that? Answer 1: It seems like the script is running out of memory. Magento's native UrlRewrite indexer is quite slow and

Why do I get a seg fault? I want to put a char array pointer inside a struct

百般思念 submitted on 2019-12-12 05:48:14
Question: Consider the following code: typedef struct port * pport; struct port { int a; int b; pport next; pport prev; char * port; }; void addNewport(pport head) { pport newPort = (pport)malloc(sizeof(pport*)); newPort->prev=temp; head->next=newPort; } int main() { pport head = (pport)malloc(sizeof(pport*)); addNewport(head); } This will result in a seg fault if I try to add a new port via a subroutine, but if I perform it in main, no seg fault will appear. Why is that? Answer 1: Replace malloc(sizeof(pport*

defining shared_ptr causes segfault (CMake)

China☆狼群 submitted on 2019-12-12 05:28:21
Question: While setting up a new project (using CMake, compiler gcc 5.2.1, Ubuntu 15.10), I wanted to use a shared_ptr. This simple main.cpp works fine: #include <iostream> #include <memory> using namespace std; int main() { cout<<"Hi there!"<<endl; return 0; } But just defining a shared_ptr will cause the program to crash with a segfault before even writing "Hi there!". #include <iostream> #include <memory> using namespace std; int main() { cout<<"Hi there!"<<endl; shared_ptr<double> test;

segmentation fault accessing a private class variable

陌路散爱 submitted on 2019-12-12 05:13:32
Question: #define TABLE_SIZE 100489 // must be a power of 2 typedef map<string,int> MAP_TYPE; typedef pair<string, int> PAIR_TYPE; class HashTable { public: //public functions HashTable(); ~HashTable(); int find(string); bool insert(string, int); private: int hash( const char* ); vector< MAP_TYPE > v; }; //HashTable constructor HashTable::HashTable() { vector< MAP_TYPE > v; //initialize vector of maps v.reserve(TABLE_SIZE); //reserve table size MAP_TYPE m; for (int i = 0; i < TABLE_SIZE; ++i) //fill

Segmentation Fault in Loop's Condition

自作多情 submitted on 2019-12-12 05:11:07
Question: The following code is supposed to sort a linked list after creating it. The sorting algorithm used is somewhat similar to bubble sort: I check two consecutive nodes and swap them if necessary. The debugger told me that the fault is raised during condition checking in the loops used while sorting. #include<iostream> #include<stdio.h> #include<stdlib.h> #include<string.h> #include<conio.h> using namespace std; struct link_list { char value[20]; struct link_list *next

Fortran/C Mixing : How to access dynamically allocated C array in Fortran?

百般思念 submitted on 2019-12-12 04:59:03
Question: I'm currently experiencing a memory issue: I have a main program coded in Fortran which calls a C/C++ subroutine to perform some tasks and store data in a dynamically allocated array. The thing is, I need access to these data when back in the Fortran main program. I tried to declare a C pointer (TYPE(C_PTR)) in Fortran to point to the array, but it doesn't seem to work. The array is present within the C subroutine, but I get a segfault when trying to access it back in the main