apache

Kafka Connect HDFS

Deadly · submitted on 2021-02-07 06:50:47
Overview: How does data get from Kafka to HDFS? Look closely and the question is not as simple as it seems. Start with two questions: 1) Why move Kafka's data onto HDFS at all? 2) Why not write to HDFS directly instead of going through Kafka? HDFS has always been designed for offline storage and computation, so it is unfriendly to real-time event writes, whereas Kafka was built for real-time data from the start; the trouble is that data sitting in Kafka cannot be analyzed in batch with offline computation frameworks. So why can't Kafka support batch offline analysis? Imagine splitting Kafka's data into per-day topics with enough partitions, then processing all of a single topic's data with Spark Streaming, Flink, or KSQL, which amounts to processing one day's worth of data. In principle such a computation performs no worse than offline computation in Spark or Hive. Better still, there would be no need to shuttle data from Kafka to HDFS over and over; we could compute on Kafka directly. We will return to that idea later. Back to the topic at hand: in the common big-data (lambda) architecture, Kafka's data is usually imported into HDFS for offline analysis. The official Kafka wiki lists several ways to integrate with the Hadoop ecosystem: https://cwiki.apache.org/confluence/display/KAFKA/Ecosystem The most commonly used of these is Flume
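Beyond Flume, the entry's title points at Kafka Connect: Confluent's HDFS sink connector streams topic data into HDFS from a few lines of configuration. A minimal sketch, in which the topic name, NameNode address, and flush size are placeholder assumptions to adjust for your cluster:

```properties
# Hypothetical HDFS sink connector config; topic, hdfs.url, and flush.size
# are illustrative values, not defaults.
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=events
hdfs.url=hdfs://namenode:8020
flush.size=1000
```

With this, the connector commits a new HDFS file per topic-partition every 1000 records, which is exactly the Kafka-to-HDFS pipeline the lambda architecture calls for.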

lbmethod_heartbeat:notice - No slotmem from mod_heartmonitor --error after installing apache2.4.2 [closed]

回眸只為那壹抹淺笑 · submitted on 2021-02-07 06:08:30
Question: Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers. Want to improve this question? Update the question so it's on-topic for Stack Overflow. Closed 4 years ago. Improve this question. On starting the webserver the error_log shows [Wed May 16 03:48:53.027372 2012] [lbmethod_heartbeat:notice] [pid 1114:tid 3075385024] AH02282: No slotmem from mod_heartmonitor [Wed May 16 03:48:53.028312 2012] [mpm_event:notice] [pid 1114:tid 3075385024] AH00489:
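For reference, the usual fix for AH02282 is to disable the heartbeat modules when heartbeat-based load balancing is not in use. A sketch against a stock httpd.conf; the module file paths are the typical defaults and may differ per distribution:

```apache
# In httpd.conf, comment these out if mod_lbmethod_heartbeat is unused:
#LoadModule heartmonitor_module modules/mod_heartmonitor.so
#LoadModule lbmethod_heartbeat_module modules/mod_lbmethod_heartbeat.so
```

The notice is harmless either way; it only reports that the heartbeat load-balancing method has no shared-memory slot to read from.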

Apache using all 16 GB Memory, how to limit its processes and memory usage?

情到浓时终转凉″ · submitted on 2021-02-07 04:32:25
Question: We are on a 16 GB AWS instance and I am finding it to be really slow. When I ran ps -aux | grep apache I can see about 60+ apache processes. When I ran watch -n 1 "echo -n 'Apache Processes: ' && ps -C apache2 --no-headers | wc -l && free -m" it showed almost all memory being used by apache. When I ran curl -L https://raw.githubusercontent.com/richardforth/apache2buddy/master/apache2buddy.pl | perl to see how to optimize Apache, it suggested that I increase the number of MaxRequestWorkers, so I
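The sizing logic apache2buddy applies can be sketched as plain arithmetic: budget the RAM Apache may use, divide by the average apache2 process size, and cap MaxRequestWorkers there. Every figure below is an assumption; measure your own per-process size with ps or apache2buddy itself.

```shell
# Rough MaxRequestWorkers sizing sketch; all numbers are illustrative.
total_mb=16384      # instance RAM (16 GB)
reserved_mb=4096    # assumed headroom for the OS, database, caches, etc.
per_proc_mb=64      # assumed average resident size of one apache2 process
max_workers=$(( (total_mb - reserved_mb) / per_proc_mb ))
echo "MaxRequestWorkers ${max_workers}"
```

The resulting value goes in the mpm_prefork (or mpm_event) configuration; with the assumed numbers it comes out to 192, which bounds Apache at roughly 12 GB rather than letting 60+ unbounded workers consume the whole box.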

Learning Spark, Part 15: SparkCore Source Code Walkthrough (1): Startup Scripts

做~自己de王妃 · submitted on 2021-02-07 04:03:28
I. Startup Script Analysis. In standalone deployment mode the cluster consists of a master and slaves; the master can use ZooKeeper for high availability, persisting its driver, worker, and application state to ZK, while the slaves run on one or more hosts. The Driver obtains its runtime environment by requesting resources from the Master. Starting the master and slaves means running start-master.sh and start-slaves.sh under the /usr/dahua/spark/sbin directory, or running start-all.sh, which in essence just calls start-master.sh and start-slaves.sh. 1.1 start-all.sh
# 1. If SPARK_HOME is unset, set it to the parent directory of this script
if [ -z "${SPARK_HOME}" ]; then
  export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
fi
# 2. Run ${SPARK_HOME}/sbin/spark-config.sh, analyzed below
. "${SPARK_HOME}/sbin/spark-config.sh"
# 3. Run "${SPARK_HOME}/sbin"/start-master.sh, analyzed below
"${SPARK_HOME}/sbin"/start-master.sh
# 4. Run "
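The SPARK_HOME detection idiom at the top of start-all.sh is worth isolating, since the same pattern recurs in every sbin/ script: if the variable is unset, derive it from the script's own location. A runnable sketch of that pattern on its own:

```shell
# If SPARK_HOME is not already set, derive it from this script's location
# (the sbin/ scripts live one level below the Spark install root).
if [ -z "${SPARK_HOME}" ]; then
  export SPARK_HOME="$(cd "$(dirname "$0")/.." && pwd)"
fi
echo "SPARK_HOME=${SPARK_HOME}"
```

This makes every launcher script relocatable: the install can live anywhere on disk, and an operator can still override the detection by exporting SPARK_HOME beforehand.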

Is it possible to run a Node script from a web page?

梦想的初衷 · submitted on 2021-02-07 03:24:36
Question: I've been searching for days now but could not get an answer. I would like to do the following: the user connects to editor.html (Apache2 with basic HTTP auth); the user wants to open a file (let's say /home/user1/myfile.txt) on the server with his user/pass (same as in passwd); a Node.js script gets started with the user rights from above and the user can edit the file. The Node script will handle the connection via websockets and read/write files. I think the biggest problem is that it's not possible to run a node script

Apache Tomcat 8 not working. Throws HTTP Status 500 - java.lang.ClassNotFoundException: org.apache.jsp.index_jsp

删除回忆录丶 · submitted on 2021-02-07 02:59:29
Question: I am using Apache Tomcat 8 and I've JDK 1.7. Tomcat starts running after I run "startup.bat". But when I try to open " http://localhost:8080/ ", it shows an error: " HTTP Status 500 - java.lang.ClassNotFoundException: org.apache.jsp.index_jsp ". Please help me to fix this. Click here to see the screenshot. Answer 1: IT'S WORKING! What I did: Opened a command prompt using "Run as administrator". Went to the "bin" directory of the Tomcat folder: 'cd C:\Program Files\apache-tomcat-8.0.3\bin'. Entered 'startup'

Setting up virtual host on XAMPP

亡梦爱人 · submitted on 2021-02-07 00:00:58
Question: I've installed XAMPP on Ubuntu in the '/opt/lampp' directory and would like to set up some VirtualHosts. The Apache Virtual Host tutorial instructs to place <VirtualHost *:80> ... </VirtualHost> code in '/etc/apache2/sites-available/[virtualhostname].conf'. The problem is that I don't have an 'apache2' folder in '/etc'. I also don't have a 'sites-available' directory in '/opt/lampp/apache2'. I do have '/opt/lampp/etc/httpd.conf' and '/opt/lampp/etc/extra/httpd-vhosts.conf' files, though. Which ones shall I
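For what it's worth, XAMPP's layout puts virtual hosts in /opt/lampp/etc/extra/httpd-vhosts.conf rather than a Debian-style sites-available directory (and the Include line for that file in httpd.conf must be uncommented). A minimal sketch, where the ServerName and DocumentRoot are placeholders:

```apache
# /opt/lampp/etc/extra/httpd-vhosts.conf (hypothetical example vhost)
<VirtualHost *:80>
    ServerName myapp.local
    DocumentRoot "/opt/lampp/htdocs/myapp"
    <Directory "/opt/lampp/htdocs/myapp">
        Require all granted
    </Directory>
</VirtualHost>
```

After editing, restart with /opt/lampp/lampp restart and map myapp.local to 127.0.0.1 in /etc/hosts for local testing.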
