
Wednesday 23 December 2015

Algorithm helps turn smartphones into 3-D scanners

While 3-D printers have become relatively cheap and available, 3-D scanners have lagged well behind. But now, an algorithm developed by Brown University researchers may help bring high-quality 3-D scanning capability to off-the-shelf digital cameras and smartphones.




              "One of the things my lab has been focusing on is getting 3-D  from relatively low-cost components," said Gabriel Taubin, a professor in Brown's School of Engineering. "The 3-D scanners on the market today are either very expensive, or are unable to do high-resolution image capture, so they can't be used for applications where details are important."
Most high-quality 3-D scanners capture images using a technique known as structured light. A projector casts a series of light patterns on an object, while a camera captures images of the object. The ways in which those patterns deform over and around an object can be used to render a 3-D image. But for the technique to work, the projector and the camera have to be precisely synchronized, which requires specialized and expensive hardware.
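
To make the structured-light idea concrete, here is a minimal sketch in Python (an illustration, not the authors' code) of how a classic binary-coded system recovers projector coordinates: each projected pattern contributes one bit of the projector-column index at every camera pixel, and the decoded pixel-to-column correspondence can then be triangulated into depth. The simple binary coding and the fixed threshold are assumptions for this example; real systems often use Gray codes and per-pixel thresholds for robustness.

    import numpy as np

    def decode_binary_patterns(images, threshold=128):
        """Recover the projector column index seen by each camera pixel.

        images: list of N grayscale frames, one per projected pattern,
                where pattern k lights the projector columns whose k-th
                index bit is 1 (simple binary coding, assumed here).
        """
        columns = np.zeros(images[0].shape, dtype=np.int32)
        for k, frame in enumerate(images):
            bit = (frame > threshold).astype(np.int32)  # lit or unlit?
            columns |= bit << k                         # pack bit k of the column index
        return columns

    # Given camera/projector calibration, each (pixel, column) pair defines
    # two rays in space; intersecting them (triangulation) yields a 3-D point.
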
The algorithm Taubin and his students have developed, however, enables the structured light technique to be done without synchronization between projector and camera, which means an off-the-shelf camera can be used with an untethered structured light flash. The camera just needs to have the ability to capture uncompressed images in burst mode (several successive frames per second), which many DSLR cameras and smartphones can do.
The researchers presented a paper describing the algorithm last month at the SIGGRAPH Asia computer graphics conference.
The problem in trying to capture 3-D images without synchronization is that the projector could switch from one pattern to the next while the image is in the process of being exposed. As a result, the captured images are mixtures of two or more patterns. A second problem is that most modern digital cameras use a rolling shutter mechanism. Rather than capturing the whole image in one snapshot, cameras scan the field either vertically or horizontally, sending the image to the camera's memory one pixel row at a time. As a result, parts of the image are captured at slightly different times, which also can lead to mixed patterns.
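
A toy simulation makes the failure mode visible. The linear blending model below is only an assumption for illustration, but it shows why raw rolling-shutter frames cannot be decoded directly: rows exposed across a pattern switch record a mixture of two patterns.

    import numpy as np

    def rolling_shutter_frame(pattern_a, pattern_b, switch_row, blend_rows=10):
        """Simulate a frame captured while the projector switches from
        pattern_a to pattern_b mid-exposure.  Rows well above switch_row
        see only pattern_a, rows well below see only pattern_b, and rows
        near the switch record a blend of the two (toy linear model)."""
        height, width = pattern_a.shape
        frame = np.empty((height, width))
        for row in range(height):
            # fraction of this row's exposure spent on pattern_a
            alpha = np.clip((switch_row - row) / blend_rows + 0.5, 0.0, 1.0)
            frame[row] = alpha * pattern_a[row] + (1 - alpha) * pattern_b[row]
        return frame
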
            "That's the main problem we're dealing with," said Daniel Moreno, a graduate student who led the development of the algorithm. "We can't use an image that has a mixture of patterns. So with the algorithm, we can synthesize images—one for every pattern projected—as if we had a system in which the pattern and image capture were synchronized."
After the camera captures a burst of images, the algorithm calibrates the timing of the image sequence using the binary information embedded in the projected pattern. Then it goes through the images, pixel by pixel, to assemble a new sequence of images that captures each pattern in its entirety. Once the complete pattern images are assembled, a standard structured light 3-D reconstruction algorithm can be used to create a single 3-D image of the object or space.
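
The published method is more involved, but the core reassembly step can be sketched as follows. Assuming a hypothetical helper exposure_of(f, row) that reports which pattern fully covered a given row's exposure in frame f (the paper derives this timing from the binary information embedded in the projected sequence), clean per-pattern images can be rebuilt row by row:

    import numpy as np

    def synthesize_pattern_images(frames, exposure_of, num_patterns):
        """Rebuild one clean image per projected pattern from an
        unsynchronized burst.  exposure_of(f, row) returns the index of
        the single pattern that covered that row's entire exposure in
        frame f, or None if the row straddled a pattern transition
        (hypothetical helper, for this sketch only)."""
        height, width = frames[0].shape
        out = [np.full((height, width), np.nan) for _ in range(num_patterns)]
        for f, frame in enumerate(frames):
            for row in range(height):
                p = exposure_of(f, row)
                if p is not None:             # row saw exactly one pattern
                    out[p][row] = frame[row]  # copy it into that pattern's image
        return out  # rows never captured cleanly would need further processing
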
In their SIGGRAPH paper, the researchers showed that the technique works just as well as synchronized structured light systems. During testing, the researchers used a fairly standard structured light projector, but the team envisions developing a structured light flash that could eventually be used as an attachment to any camera, now that there's an algorithm that can properly assemble the images.
"We think this could be a significant step in making precise and accurate 3-D scanning cheaper and more accessible," Taubin said.

Monday 12 October 2015

GATE syllabus for CSE 2016


Section 1: Engineering Mathematics
Discrete Mathematics: Propositional and first-order logic. Sets, relations, functions, partial orders and lattices. Groups. Graphs: connectivity, matching, coloring. Combinatorics: counting, recurrence relations, generating functions.
Linear Algebra: Matrices, determinants, system of linear equations, eigenvalues and eigenvectors, LU decomposition.
Calculus: Limits, continuity and differentiability. Maxima and minima. Mean value theorem. Integration.
Probability: Random variables. Uniform, normal, exponential, Poisson and binomial distributions. Mean, median, mode and standard deviation. Conditional probability and Bayes theorem.

Computer Science and Information Technology

Section 2: Digital Logic
Boolean algebra. Combinational and sequential circuits. Minimization. Number representations and computer arithmetic (fixed and floating point).

Section 3: Computer Organization and Architecture
Machine instructions and addressing modes. ALU, data‐path and control unit. Instruction pipelining. Memory hierarchy: cache, main memory and secondary storage; I/O interface (interrupt and DMA mode).

Section 4: Programming and Data Structures
Programming in C. Recursion. Arrays, stacks, queues, linked lists, trees, binary search trees, binary heaps, graphs.

Section 5: Algorithms
Searching, sorting, hashing. Asymptotic worst case time and space complexity. Algorithm design techniques: greedy, dynamic programming and divide‐and‐conquer. Graph search, minimum spanning trees, shortest paths.

Section 6: Theory of Computation
Regular expressions and finite automata. Context-free grammars and push-down automata. Regular and context-free languages, pumping lemma. Turing machines and undecidability.

Section 7: Compiler Design
Lexical analysis, parsing, syntax-directed translation. Runtime environments. Intermediate code generation.

Section 8: Operating System
Processes, threads, inter‐process communication, concurrency and synchronization. Deadlock. CPU scheduling. Memory management and virtual memory. File systems.

Section 9: Databases
ER‐model. Relational model: relational algebra, tuple calculus, SQL. Integrity constraints, normal forms. File organization, indexing (e.g., B and B+ trees). Transactions and concurrency control.

Section 10: Computer Networks
Concept of layering. LAN technologies (Ethernet). Flow and error control techniques, switching. IPv4/IPv6, routers and routing algorithms (distance vector, link state). TCP/UDP and sockets, congestion control. Application layer protocols (DNS, SMTP, POP, FTP, HTTP). Basics of Wi-Fi. Network security: authentication, basics of public key and private key cryptography, digital signatures and certificates, firewalls.

Monday 17 August 2015

A Project Report ON ONLINE APTITUDE GAME



A Project Report
ON
ONLINE APTITUDE GAME



Submitted To:
In22 Lab

Submitted By:

B.ObuliRaj      
G.RenceAbishek
S.Dinesh
INDEX
1. Acknowledgement
2. Introduction
    2.1 Purpose
    2.2 Scope
3. Software Development Methodology
4. Hardware Requirement
5. Software Requirement
6. Software Specification
7. ER-Diagram
8. Product Functions

1. Acknowledgement

We gratefully acknowledge the assistance, cooperation, guidance and clarifications provided by In22 Lab during the development of the Online Aptitude Game website. Our extreme gratitude goes to A.S.PRABAHARAN (AP/CSE) and C.A.SREERAM (Director/Talent Development Cell), who guided us throughout the project. Without their willing disposition, spirit of accommodation, frankness, timely clarification and, above all, faith in us, this project could not have been completed in due time.

Their readiness to discuss all important matters at work deserves special attention. We would also like to thank the entire faculty of the college for their cooperation and valuable support.

2. Introduction

Online Aptitude Game is being launched because of a need for a destination that is beneficial to both institutes and students. With this site, institutes can register and host online exams. Students can take exams and view their results. This site is an attempt to remove the existing flaws in the manual system of conducting exams.

2.1 Purpose

Online Aptitude Game fulfills the requirement of institutes to conduct exams online. They do not have to go to a software developer to have a separate site built for conducting exams online. They just have to register on the site and enter the exam details and the list of students who can appear for the exam.
Students can take the exam without going to any physical destination, and they can view the result immediately. Thus the purpose of the site is to provide a system that saves the effort and time of both the institutes and the students.

2.2 Scope

Online Aptitude Game System is a web application that establishes a network between institutes and students. Institutes enter on the site the questions they want in the exam. These questions are displayed as a test to the eligible students. The answers entered by the students are then evaluated, and their scores are calculated and saved. These scores can then be accessed by the institutes to determine which students passed or to evaluate their performance. The system provides the platform but does not directly participate in, nor is it involved in, any tests conducted. Questions are posted not by the site, but by the users of the site.
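
The evaluation step described above reduces to comparing each submitted answer against the institute's key. A minimal illustrative sketch (shown in Python for brevity rather than the project's JSP, with invented data structures) might look like:

    def evaluate_exam(answer_key, submitted):
        """Score a student's submission against the institute's answer key.
        Both arguments map question ids to chosen options (hypothetical
        structures used only for this illustration)."""
        return sum(1 for qid, correct in answer_key.items()
                   if submitted.get(qid) == correct)

    # Example: a three-question test where the student gets two right
    key = {"q1": "B", "q2": "D", "q3": "A"}
    print(evaluate_exam(key, {"q1": "B", "q2": "C", "q3": "A"}))  # prints 2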

3. Software Development Methodology

The establishment and use of sound engineering principles in order to economically obtain software that is reliable and works efficiently on real machines is called software engineering.

Software Engineering
It is the discipline whose aim is:
1. Production of quality software
2. Software that is delivered on time
3. Cost within the budget
4. Satisfaction of all requirements

A software process is the way in which we produce software. Apart from hiring smart, knowledgeable engineers and buying the latest development tools, an effective software development process is also needed, so that engineers can systematically use the best technical and managerial practices to complete their projects successfully.

A software life cycle is the series of identifiable stages that a software product undergoes during its lifetime. A software life cycle model is a descriptive and diagrammatic representation of the software life cycle. A life cycle model represents all the activities required to make a software product transit through its life cycle phases. It also captures the order in which these activities are to be undertaken.
Life Cycle Models
There are various life cycle models to improve the software process.

WATERFALL MODEL

This model contains five phases:

Requirement analysis and specification
The goal of this phase is to understand the exact requirements of the customer and to document them properly in a Software Requirements Specification (SRS).
Design
The goal of this phase is to transform the requirement specification into a structure that is suitable for implementation in some programming language.

Implementation
During this phase the design is implemented. Initially, small modules are tested in isolation from the rest of the software product.

Testing
In this phase all the modules are tested together.

Maintenance
Release of the software inaugurates the operation and maintenance phase of the life cycle.

4. Hardware Requirement

Component        Minimum requirement
Processor        32-bit, Intel Pentium
RAM              2 GB for developer or evaluation use
Hard disk        16 GB (80 GB for production use)
Keyboard         104-key Windows keyboard
5. Software Requirement

Type                Minimum requirement
Operating System    Windows XP
Browser             Firefox


6. Software Specification
Back End: NetBeans IDE 7.3.1
Server: Apache Tomcat 7.0.34
Front End: Adobe Dreamweaver CS5

7. ER-Diagram
ER Diagram for Online Aptitude Exams System

An entity-relationship model (ER model) is a data model for describing the data or information aspects of a business domain or its process requirements, in an abstract way that lends itself to ultimately being implemented in a database such as a relational database. The main components of ER models are entities (things) and the relationships that can exist among them.

8. Product Functions
Online Aptitude Exams Project Functions:
1. Login
2. Register
3. Record the data in the database
4. Home page of the project
5. Exercise
6. Test
    6.1 Random test
    6.2 Topic-wise test
7. Check test marks
8. Database process
9. Show the results & status
10. Sign out
Coding Language:
The front end is designed using HTML5, CSS and JavaScript.
The back end is designed using JSP.
The server used is Apache Tomcat.

Saturday 11 July 2015

Neuroscience-based algorithms make for better networks

When it comes to developing efficient, robust networks, the brain may often know best. Researchers from Carnegie Mellon University and the Salk Institute for Biological Studies have, for the first time, determined the rate at which the developing brain eliminates unneeded connections between neurons during early childhood.
Neurons create networks through a process called pruning. At birth and throughout early childhood, the brain's neurons make a vast number of connections—more than the brain needs. As the brain matures and learns, it begins to quickly prune away connections that aren't being used. When the brain reaches adulthood, it has about 50 to 60 percent fewer synaptic connections than it had at its peak in childhood.
In sharp contrast, computer science and engineering networks are often optimized using the opposite approach. These networks initially contain a small number of connections and then add more connections as needed.
"Engineered networks are built by adding connections rather than removing them. You would think that developing a network using a pruning process would be wasteful," said Ziv Bar-Joseph, associate professor in Carnegie Mellon's Machine Learning and Computational Biology departments. "But as we showed, there are cases where such a process can prove beneficial for engineering as well."

 

The researchers first determined key aspects of the pruning process by counting the number of synapses present in a mouse model's somatosensory cortex over time. After counting synapses in more than 10,000 electron microscopy images, they found that synapses were rapidly pruned early in development, and then as time progressed, the pruning rate slowed.
The results of these experiments allowed the team to develop an algorithm for designing computational networks based on the brain's pruning approach. Using simulations and theoretical analysis, they found that the neuroscience-based algorithm produced networks that were much more efficient and robust than those produced by current engineering methods.
In the networks created with pruning, the flow of information was more direct, and the networks provided multiple paths for information to reach the same endpoint, which minimized the risk of network failure.
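
As a rough illustration of the idea (a toy sketch, not the authors' published algorithm), a pruning-based design can start from a dense graph, observe simulated traffic, and remove the least-used edges at a rate that decays over time, mirroring the fast-then-slow schedule seen in the synapse counts. The networkx library is assumed here purely for graph bookkeeping.

    import random
    import networkx as nx

    def prune_network(n_nodes=30, n_demands=200, steps=10):
        """Toy pruning-based design: start with a complete graph, route
        random traffic, then repeatedly remove the least-used edges at a
        rate that decays over time (fast early, slow later)."""
        G = nx.complete_graph(n_nodes)
        for step in range(steps):
            usage = {tuple(sorted(e)): 0 for e in G.edges()}
            for _ in range(n_demands):                 # simulated demands
                s, t = random.sample(range(n_nodes), 2)
                path = nx.shortest_path(G, s, t)
                for u, v in zip(path, path[1:]):
                    usage[tuple(sorted((u, v)))] += 1
            frac = 0.3 / (step + 1)                    # decaying prune rate
            idle = sorted(usage, key=usage.get)[:int(frac * G.number_of_edges())]
            for u, v in idle:
                G.remove_edge(u, v)
                if not nx.has_path(G, u, v):           # never disconnect the graph
                    G.add_edge(u, v)
        return G

Comparing the pruned graph's average path length and edge count against a graph grown by adding edges gives a feel for the efficiency and robustness trade-off the article describes.
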
"We took this high-level algorithm that explains how neural structures are built during development and used that to inspire an algorithm for an engineered network," said Alison Barth, professor in Carnegie Mellon's Department of Biological Sciences and member of the university's BrainHubSM initiative. "It turns out that this neuroscience-based approach could offer something new for computer scientists and engineers to think about as they build networks."
As a test of how the algorithm could be used outside of neuroscience, Saket Navlakha, assistant professor at the Salk Institute's Center for Integrative Biology and a former postdoctoral researcher in Carnegie Mellon's Machine Learning Department, applied the algorithm to flight data from the U.S. Department of Transportation. He found that the synaptic pruning-based algorithm created the most efficient and robust routes to allow passengers to reach their destinations.
"We realize that it wouldn't be cost effective to apply this to networks that require significant infrastructure, like railways or pipelines," Navlakha said. "But for those that don't, like wireless networks and sensor networks, this could be a valuable adaptive method to guide the formation of networks."
In addition, the researchers say the work has implications for neuroscience. Barth believes that the change in pruning rates from adolescence to adulthood could indicate that there are different biochemical mechanisms that underlie pruning.
"Algorithmic neuroscience is an approach to identify and use the rules that structure brain function," Barth said. "There's a lot that the brain can teach us about computing, and a lot that computer science can do to help us understand how neural networks function."
As the birthplace of artificial intelligence and cognitive psychology, Carnegie Mellon has been a leader in the study of brain and behavior for more than 50 years. The university has created some of the first cognitive tutors, helped to develop the Jeopardy-winning Watson, founded a groundbreaking doctoral program in neural computation, and completed cutting-edge work in understanding the genetics of autism. Building on its strengths in biology, computer science, psychology, statistics and engineering, CMU recently launched BrainHubSM, a global initiative that focuses on how the structure and activity of the brain give rise to complex behaviors.


Read more at: http://phys.org/news/2015-07-neuroscience-based-algorithms-networks.html#jCp