Referenced file contains errors (http://www.springframework.org/schema/beans/spring-beans-Xxsd)







In Eclipse: `Window -> Preferences -> General -> Network Connections -> Cache`, first check `Disable caching`, then click `Remove All`. That does it.

These steps may still leave errors reported afterwards; when that happens, a `Project -> Clean` fixes the rest.

Using c:if to test strings for equality




<c:if test="${param.cmd eq 'up'}">...</c:if>


<c:if test="${not empty param.cmd}"></c:if>
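Putting the two tests together, a minimal JSP fragment would look like the following; the surrounding markup and the message inside are illustrative, only the `param.cmd` checks come from the snippets above:

```jsp
<%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>
<c:if test="${not empty param.cmd}">
    <c:if test="${param.cmd eq 'up'}">
        <p>cmd is 'up'</p>
    </c:if>
</c:if>
```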


A working example of running OpenCV under Java EE (servlets on Tomcat 7)

Two ways to resolve errors such as `java.lang.UnsatisfiedLinkError: opencv_java300`.


I had only ever used OpenCV with Java SE before and was not familiar with using it under Java EE, so there was nothing for it but to work through the problem slowly.


The usual approach is: add the JAR, then configure the native library. That is not how it works under Java EE; there, you need to add the following to the VM arguments:







Modifying the VM arguments



While experimenting today, I found that the last VM argument can actually be omitted, though then it cannot be used






The call `System.loadLibrary(name)` is effectively equivalent to the call `Runtime.getRuntime().loadLibrary(name)` (quoting the `java.lang.System` Javadoc).
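Since `loadLibrary` resolves a platform-specific file name against `java.library.path`, a small stdlib-only helper can check up front whether a library such as `opencv_java300` is actually visible. The class and method names here are illustrative:

```java
import java.io.File;

// Helper for checking whether a native library such as opencv_java300 is
// visible on java.library.path.
class NativeLibLocator {

    // Maps a library name to its platform-specific file name, e.g.
    // "opencv_java300" becomes "opencv_java300.dll" on Windows or
    // "libopencv_java300.so" on Linux.
    static String platformName(String lib) {
        return System.mapLibraryName(lib);
    }

    // Scans each entry of java.library.path for the mapped file name,
    // mimicking the lookup System.loadLibrary performs before loading.
    static boolean isOnLibraryPath(String lib) {
        String mapped = platformName(lib);
        String path = System.getProperty("java.library.path", "");
        for (String dir : path.split(File.pathSeparator)) {
            if (!dir.isEmpty() && new File(dir, mapped).isFile()) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(platformName("opencv_java300"));
        System.out.println(isOnLibraryPath("opencv_java300"));
    }
}
```

Running this before starting the container tells you immediately whether the `-Djava.library.path` setting took effect.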


OpenCV 3.0 Computer Vision with Java(PACKT,2015)

As the Internet gets more and more interactive, a subject of great interest is how to deal with image processing on the server side that enables you to create web applications dealing with OpenCV. As Java is among the languages of choice when developing web apps, this chapter shows the entire architecture of an application that lets users upload an image and add a fedora hat on top of detected faces using techniques learned throughout the book.

In this chapter, we will cover the following topics: 

• Setting up an OpenCV web application 

• Mixed reality 

• Image uploading 

• Dealing with HTTP requests

By the end of this chapter you will know how to create a complete web application with image processing, obtain input from the user, process the image on the server side, and return the processed image to the user. 

Setting up an OpenCV web application 


Since this chapter covers the development of a web application using Java OpenCV, it is important to address a couple of differences when going to the server side. The first thing is to tell the web container, generally Tomcat, Jetty, JBoss, or Websphere, about the location of native libraries. Other details deal with loading the native code. This should happen as soon as the web server goes up and should not occur again.
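One way to guarantee the native load happens exactly once is a compare-and-set guard. This is a minimal stdlib-only sketch; the class name and the `Runnable` indirection are illustrative, and in a deployment the `Runnable` would wrap a call like `System.loadLibrary("opencv_java300")`:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// One-time initialization guard for loading native code when the web
// container starts.
class NativeLoader {
    private static final AtomicBoolean done = new AtomicBoolean(false);

    // Runs the loader exactly once, no matter how many servlets or
    // listeners call this concurrently; returns true only for the
    // call that actually ran it.
    static boolean runOnce(Runnable loader) {
        if (done.compareAndSet(false, true)) {
            loader.run();
            return true;
        }
        return false;
    }
}
```

A `ServletContextListener`'s `contextInitialized` method is a natural place to call `runOnce`, so the load happens as the server goes up and never again.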

The advantages of using the web architecture are significant. As certain  image-processing tasks are compute intensive, they could easily drain the  device's battery in no time, so, taking them to a more robust hardware on the  cloud would relieve local processing. Besides that, there's no need for users to  install anything more than the web browser, and the updates happening on the server side are also very handy.

On the other hand, there are a few drawbacks. If, instead of hosting the web application on the administrator infrastructure, one intends to host it on Java servers online, it should be clear whether it allows native code to be run or not. At the time of writing, Google's App Engine does not allow it, but it is easy to set up a Linux server on Amazon EC2 or Google's Compute Engine that smoothly runs it although this won't be covered in this book. Another thing to be considered is that several computer vision applications need to be run in real time, even at the rate of 20 frames per second, for instance, which would be impractical in a web architecture, due to long upload times, and this type of application should be run locally. 

In order to create our web application, we will go through the following steps: 

  1. Creating a Maven-based web application.

  2. Adding OpenCV dependencies.

  3. Running the web application.

  4. Importing the project to Eclipse.

In the following sections, we will cover these steps in detail. 

Creating a Maven-based web application

There are several ways to create web applications in Java. Spring MVC, Apache Wicket, and Play Framework are all great options among others. Also, on top of these frameworks, we can put JavaServer Faces, PrimeFaces, or RichFaces as component-based user interfaces for these web applications. For this chapter though, instead of addressing all these technologies, the approach will be to only use servlets for you to choose your frameworks. You should notice that a servlet is simply a Java class used to extend the capabilities of a server, and this is generally used to process or store data that was submitted through an HTML form. The servlet API has been around since 1997, so it has been exhaustively used, and there are several books and samples about it. Although this chapter focuses on Servlet 2.x for simplicity, we need to be aware that the API is synchronous and that it might be better to use an asynchronous one, such as Servlet 3.x, for applications that will receive several clients together.

Although any IDE can easily generate a web application through a wizard—such as going to Eclipse and navigating to File | New | Project… | Web | Dynamic Web Project—we'll focus on starting it with the help of Maven since we can easily get native dependencies. As long as it has been installed correctly according to instructions in Chapter 1, Setting Up OpenCV for Java, Maven can set up a web application through the use of a prototype. This is achieved through the following command:

mvn archetype:generate -DgroupId=com.mycompany.app -DartifactId=my-webapp -DarchetypeArtifactId=maven-archetype-webapp -DinteractiveMode=false

This command will call the generate goal from the archetype plugin. Think of an archetype as a project template. This Maven plugin will generate a web application from a template because we have set archetypeArtifactId as maven-archetype-webapp through the -DarchetypeArtifactId=maven-archetype-webapp option. The other option, -DartifactId=my-webapp, will simply set the folder name of the web application as defined in this option, while groupId is Maven's universally unique identifier for a project. 

Note that the following structure will be created:


The preceding is a simple structure for a web application. You should pay attention to the web.xml file, which is used for mapping servlets, as well as index.jsp, which is a simple Java Server Page file. By now you should be able to run this web application in Tomcat, for instance, with little effort. Simply type the following command:

cd my-webapp 

mvn tomcat:run

Now, if you access the address http://localhost:8080/my-webapp/, the following response should be seen in the browser:


Notice that this means we have successfully created a web project, we are running it through a Tomcat web container, and it is available on the localhost server, on port 8080, under the name my-webapp. The Hello World! message can be seen in index.jsp. In the following section, you are going to customize the pom file in order to add OpenCV dependencies. 

Adding OpenCV dependencies 

Since the web application archetype has created a project structure for us, we are going to add OpenCV dependencies for the generated pom.xml. If you open it,  you will see the following code:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <name>my-webapp Maven Webapp</name>

Notice that the only dependency is on junit. Now add the following to the dependencies tag:
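The listing is missing here; this is a sketch of what the dependencies block might contain. The javax.servlet-api coordinates and version 3.0.1 come from the text, while the opencvjar group, versions, and classifier follow the Chapter 1 setup and are assumptions:

```xml
<dependency>
  <groupId>org.javaopencvbook</groupId>
  <artifactId>opencvjar</artifactId>
  <version>3.0.0</version>
</dependency>
<dependency>
  <groupId>org.javaopencvbook</groupId>
  <artifactId>opencvjar-runtime</artifactId>
  <version>3.0.0</version>
  <classifier>natives-windows-x86</classifier>
</dependency>
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>javax.servlet-api</artifactId>
  <version>3.0.1</version>
  <scope>provided</scope>
</dependency>
```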


The first two dependencies, opencvjar and opencvjar-runtime, are the same ones that have been discussed in Chapter 1, Setting Up OpenCV for Java. Now, the dependency on javax.servlet-api refers to the servlet API version 3.0.1, which is used to make files upload more easily. Besides using these dependencies, all other configurations are mentioned in Chapter 1, Setting Up OpenCV for Java, such as adding the JavaOpenCVBook repository, maven-jar-plugin, maven-dependency-plugin, and maven-nativedependencies-plugin.

The only new plugin is tomcat7 as we would require it to use the file upload API from servlet 3.0. In order to add the tomcat7 plugin, look for the <plugins> section in pom.xml and add the following code:
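The listing is missing here; this is a sketch of the tomcat7 plugin entry. The org.apache.tomcat.maven coordinates are the plugin's published ones and port 9090 matches the text, while the version shown is only a plausible choice:

```xml
<plugin>
  <groupId>org.apache.tomcat.maven</groupId>
  <artifactId>tomcat7-maven-plugin</artifactId>
  <version>2.2</version>
  <configuration>
    <port>9090</port>
  </configuration>
</plugin>
```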


Besides adding the ability to run tomcat7 from Maven, it will also configure port 9090 as the default port for our server, but you can use another one. The final pom.xml file can be found in this chapter's source code project. Running the mvn package command will show that everything in the project setup is fine. In the next section, we are going to check the whole process through a simple OpenCV call from the .jsp file. 

Running the web application 

Now that all the dependencies have been set up, it should be straightforward to run our web application. One detail should be noticed, though. Since our application relies on native code, the opencv_java300.dll file, or the shared object, we should put it in the Java library path prior to running the Tomcat server. There are several approaches to doing this, depending on your deployment strategy, but a simple one could be setting the path through the MAVEN_OPTS environment variable. You should type the following command in the terminal:

set MAVEN_OPTS=-Djava.library.path=D:/your_path/my-webapp/target/natives

Please remember to change your_path to the place you are setting up your project, the parent folder of my-webapp. In order to check that the application server can correctly load OpenCV native libraries, we are going to set up a simple servlet which is able to output the correct installed version. Change the index.jsp file generated in your my-webapp\src\main\webapp folder to the following code:

	<h2>OpenCV Webapp Working!</h2>   
	 <%@ page import = "org.opencv.core.Core" %>    
	 Core.VERSION: <%= Core.VERSION %>  

Now, run your server by typing mvn tomcat7:run. Try loading your application in your web browser at the address http://localhost:9090, and you should see the page outputting your loaded OpenCV version. Although this code doesn't really load the native libraries, since Core.VERSION can be retrieved from the pure Java JAR, it's not good practice to mix business code (the code that actually does your image processing) with presentation code, that is, the Java Server Page we just edited. In order to deal with image processing, we are going to concentrate the code in a servlet that deals only with it. 

Importing the project to Eclipse

Now that the project is all set up with Maven, it should be easy to import it to Eclipse. Simply issue the following Maven command:

mvn eclipse:eclipse -Dwtpversion=2.0

Remember to add the -Dwtpversion=2.0 flag to add support for WTP version 2.0, which is Eclipse's Web Tools platform. If you have not set up your M2_REPO as explained in Chapter 1, Setting Up OpenCV for Java, a simple trick can automate it for you. Type the following command:

mvn -Declipse.workspace="YOUR_WORKSPACE_PATH" eclipse:configure-workspace

The YOUR_WORKSPACE_PATH path should be changed to something similar to  C:\Users\baggio\workspace if that is where your Eclipse workspace is located.

In Eclipse, navigate through File | Import | General | Existing Projects into the workspace and point to your my-webapp folder. Notice that your Eclipse should have WTP support. In case you receive a Java compiler level does not match the version of the installed Java project facet message, simply right-click it and in the Quick Fix menu, choose Change Java Project Facet version to Java 1.8. Now you can run it by right-clicking in your project, navigating to Run as | Run on Server, selecting Apache | Tomcat v7.0 Server, and hitting Next. If you don't have an existing Tomcat 7 installation, select Download and Install, as shown in the  next screenshot:


Select a folder for your Tomcat7 installation and click on Next and Finish. Now, you can run your application directly from Eclipse by right-clicking on your project and clicking on Run as | Run on Server. In case you receive a java.lang.UnsatisfiedLinkError: no opencv_java300 in java.library.path error, right-click your project, navigate to Run As | Run Configurations..., and in the Arguments tab, in the VM arguments text box, add -Djava.library.path="C:\path_to_your\target\natives". Click Apply and restart your server by going to the Servers tab and right-clicking your Tomcat7 execution | Restart.

Mixed reality web applications

The web application we are going to develop draws Fedora hats on top of the detected heads in a given image. In order to do this, the user uploads the image through a simple form, and then it is converted to an OpenCV matrix in memory. After conversion, a cascade classifier looking for faces is run over the matrix. A simple scale and a translation are applied to estimate the hat's position and scale. A transparent fedora image is then drawn on the specified position for each of the detected faces. The result is then returned through HTTP by giving the mixed reality picture to the user. Notice that all the processing happens on the server side, so the client is only left to upload and download the image, which is very useful for clients that rely on batteries, such as smartphones.
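The scale-and-translate step can be sketched with plain rectangle arithmetic. The nested `Rect` class stands in for OpenCV's `org.opencv.core.Rect`, and the 0.6 height ratio and the upward offset are illustrative guesses, not the book's actual constants:

```java
// Illustrative placement of a hat rectangle above a detected face.
class HatPlacer {

    // Simple axis-aligned rectangle, standing in for OpenCV's Rect.
    static class Rect {
        final int x, y, width, height;
        Rect(int x, int y, int width, int height) {
            this.x = x; this.y = y; this.width = width; this.height = height;
        }
    }

    // Scales the hat to the face width and translates it so it sits
    // directly above the detected face rectangle.
    static Rect hatFor(Rect face) {
        int hatWidth = face.width;
        int hatHeight = (int) (face.height * 0.6); // assumed proportion
        return new Rect(face.x, face.y - hatHeight, hatWidth, hatHeight);
    }
}
```

With a face detected at (10, 100) of size 50x80, the hat lands at (10, 52) with size 50x48, just above the face.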

Mixed reality (MR), sometimes referred to as hybrid reality (encompassing both augmented reality and augmented virtuality), refers to the merging of real and virtual worlds to produce new environments and visualisations where physical and digital objects co-exist and interact in real time. Not taking place only in the physical world or the virtual world, but a mix of reality and virtual reality, encompassing augmented reality and augmented virtuality. Source: Fleischmann, Monika; Strauss, Wolfgang (eds.) (2001). Proceedings of »CAST01//Living in Mixed Realities« Intl. Conf. On Communication of Art, Science and Technology, Fraunhofer IMK 2001, 401. ISSN 1618–1379 (Print), ISSN 1618–1387 (Internet)

This web application can be divided into a couple of simpler steps: 

  1. Image upload.

  2. Image processing.

  3. Response image. 

The following sections will cover these steps in detail.

Image upload

Firstly, we are going to turn our dummy Java Server Page into a form that will require the user to choose a local file, similar to the one seen in the following screenshot:


The following code shows the complete Java Server Page. Note the form element, which states that it will call a post method processed in the doPost part of the servlet and requests that the web server accept the data enclosed in the form for storage. The enctype="multipart/form-data" attribute states that the form data will not be character-encoded, unlike the "text/plain" encoding type, which converts spaces to + symbols. Another important attribute is action="upload". It makes sure that the data encoded in the form is sent to the "/upload" URL. The input element with the type "file" simply works as a call to the operating system's file dialog, which pops up and lets the user specify the file location. Finally, the input element with the "submit" type deals with sending the request with the form data when the button is clicked:

<%@ page language="java" contentType="text/html; charset=ISO-8859-1"    pageEncoding="ISO-8859-1"%> 
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"    "http://www.w3.org/TR/html4/loose.dtd"> 
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"> 
<title>File Upload</title> 
        <h1>File Upload</h1>
        <form method="post" action="upload" enctype="multipart/form-data">
            Select file to upload: <input type="file" name="file" size="60" />
            <br />
            <br />
            <input type="submit" value="Upload" />
        </form>

When pressing the Submit button, a stream of bytes is sent to the server, which will forward them to a servlet called Upload. Note that mapping from the /upload URL to the Upload servlet happens in the /src/main/webapp/WEB-INF/web.xml file, as shown in the following lines:
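The web.xml lines are missing here; this is a sketch of what the mapping might look like. The servlet name and /upload pattern come from the text, while the package prefix follows the com.mycompany.app groupId used in the archetype command and is an assumption:

```xml
<servlet>
  <servlet-name>Upload</servlet-name>
  <servlet-class>com.mycompany.app.UploadServlet</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>Upload</servlet-name>
  <url-pattern>/upload</url-pattern>
</servlet-mapping>
```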


Pay attention to the fact that, when the user hits the Submit button from the form, the doPost method from the mapped servlet class, UploadServlet, is called. This method is the core of this web application, and we are going to see it in detail in the following code:

protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {  
    Mat image = receiveImage(request);  
    Mat overlay = loadOverlayImage();  
    detectFaceAndDrawHat(image, overlay);  
    writeResponse(response, image); 
}

The main action in the doPost method starts by loading the OpenCV library, as seen in the previous chapters, and then loading the cascade that will be used later for face detection. For the sake of brevity, the initialization is done here, but in real code, you should use a ServletContextListener to initialize it. Then, the receiveImage method deals with receiving bytes from the upload and converting them to an OpenCV matrix. The other methods take care of loading the fedora hat image and detecting people's faces so that the overlay can be drawn through the detectFaceAndDrawHat method. Finally, the writeResponse method answers the request. We will cover receiveImage in more detail in the following code:

private Mat receiveImage(HttpServletRequest request) throws IOException, ServletException { 
    byte[] encodedImage = receiveImageBytes(request);  
    return convertBytesToMatrix(encodedImage); 
}

Note that receiveImage simply grabs bytes from an upload request in receiveImageBytes and then converts them to a matrix. The following  is the code for receiveImageBytes:

private byte[] receiveImageBytes(HttpServletRequest request) throws IOException, ServletException {  
    InputStream is = (InputStream) request.getPart("file").getInputStream();  
    BufferedInputStream bin = new BufferedInputStream(is);    
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();    
    int ch = 0;  
    while ((ch = bin.read()) != -1) {      
        buffer.write(ch);
    }
    byte[] encodedImage = buffer.toByteArray();  
    return encodedImage; 
}

This is the default code to receive an upload. It accesses the "file" field from the form and gets its stream through request.getPart("file").getInputStream(). Then, a buffer is created, so all data from the input stream is written through the write() method as long as there's data from the upload. The byte array is then returned through the ByteArrayOutputStream class's toByteArray() method. Since what we have received at this point is just a bunch of bytes, there is a need to decode the image format and convert it to an OpenCV matrix. Fortunately, there's already a method that does that, imdecode, from the Imgcodecs package, the signature of which is as follows:
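The same drain-the-stream pattern can be exercised outside a servlet with any plain InputStream; the class and method names below are illustrative:

```java
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Stdlib-only version of the byte-draining loop used by receiveImageBytes.
class StreamUtil {

    // Reads the stream one byte at a time into a growable buffer and
    // returns the accumulated bytes, the same pattern the servlet
    // applies to the uploaded file part.
    static byte[] toBytes(InputStream is) throws IOException {
        BufferedInputStream bin = new BufferedInputStream(is);
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        int ch;
        while ((ch = bin.read()) != -1) {
            buffer.write(ch);
        }
        return buffer.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] out = toBytes(new ByteArrayInputStream("hi".getBytes()));
        System.out.println(out.length);
    }
}
```

For large uploads, reading into a byte[] chunk via `read(byte[])` would be faster, but the byte-at-a-time loop mirrors the servlet code above.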

public static Mat imdecode(Mat buf, int flags)

The buf argument is a Mat buffer that we will create from the byte array, and  flags is an option used to convert the Mat buffer returned to grayscale or color,  for instance. 

The complete code for the decoding can be seen in the following lines:

private Mat convertBytesToMatrix(byte[] encodedImage) {  
    Mat encodedMat = new Mat(encodedImage.length, 1, CvType.CV_8U);  
    encodedMat.put(0, 0, encodedImage);  
    Mat image = Imgcodecs.imdecode(encodedMat, Imgcodecs.CV_LOAD_IMAGE_ANYCOLOR);  
    return image; 
}

Now that this is done, we have received the user's image upload and converted it to our well-known Mat class. It's now time to create the mixed reality.