|VOLUME 14, ISSUE 1, 1 October, 2018 To 31 December, 2020|
|VOLUME 13, ISSUE 1, 1 April, 2017 To 30 June, 2017|
|VOLUME 12, ISSUE 1, 1 January, 2017 To 31 March, 2017|
|VOLUME 11, ISSUE 1, 16 October, 2016 To 31 December, 2016|
|VOLUME 9, ISSUE 1, 15 April, 2016 To 14 June, 2016|
|VOLUME 8, ISSUE 1, 15 January, 2016 To 14 April, 2016|
|VOLUME 7, ISSUE 1, 15 October, 2015 To 14 January, 2016|
|VOLUME 6, ISSUE 1, 15 July, 2015 To 14 October, 2015|
|VOLUME 5, ISSUE 1, 16 April, 2015 To 15 July, 2015|
|VOLUME 4, ISSUE 1, 16 January, 2015 To 15 April, 2015|
|VOLUME 2, ISSUE 1, 16 August, 2014 To 15 November, 2014|
|VOLUME 1, ISSUE 1, 15 June, 2014 To 15 August, 2014|
Image processing has long been an active topic of research, and researchers have set several milestones in the field. The major challenges in image processing have been detecting particular objects in images, matching images, and handling distorted images, along with maintaining the accuracy and performance of image processing algorithms. Recent innovations in machine learning have opened new prospects in image processing: they not only increase accuracy and performance but also reduce the complexity of implementing these algorithms. In this work, a survey of such techniques is presented, measuring and listing some of the latest machine learning techniques and algorithms used in image processing.
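As a minimal illustration of the image-matching problem mentioned above, the sketch below uses one of the simplest machine-learning approaches, nearest-neighbour matching over pixel values. The 2x2 "images" and labels are invented toy data, not from the surveyed work.

```python
# Toy nearest-neighbour image matching: classify a query image by the
# closest image in a labelled gallery. Images are flattened 2x2 grayscale
# grids; all data here is illustrative.

def distance(a, b):
    """Sum of squared pixel differences between two flattened images."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def match(query, gallery):
    """Return the label of the gallery image closest to the query."""
    return min(gallery, key=lambda item: distance(query, item[1]))[0]

# Toy "training set": (label, flattened 2x2 image)
gallery = [
    ("bright", [200, 210, 190, 205]),
    ("dark",   [20, 15, 30, 25]),
]

print(match([195, 200, 185, 210], gallery))  # → bright
```

Real image-processing pipelines replace raw pixel distances with learned features, but the learn-from-examples structure is the same.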
Virtual machine migration has been a challenge in cloud computing from the very first implementations of the cloud. Users may create thousands of virtual machines and apply unbounded loads on them; the cloud benefits them greatly through pay-per-use pricing and optimal utilization of the resources they pay for. For cloud server managers, however, these same advantages become challenges: a server whose load has increased must be relieved by migrating one or more virtual machines to other, comparatively vacant servers. This requires not only migration without shutdown but also proactive evaluation of which virtual machines to migrate. For this proactive evaluation, many researchers have proposed various algorithms, and continuous research is still required to enhance the performance of such systems. In this paper, these methods are discussed and the challenges in them are identified.
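The proactive evaluation described above can be sketched as a simple selection heuristic: when a host crosses a utilisation threshold, choose the virtual machine whose migration relieves the overload at the least cost. The threshold, the least-memory cost proxy, and the VM data below are illustrative assumptions, not a specific algorithm from the surveyed literature.

```python
# Hedged sketch of proactive VM selection on an overloaded host:
# among VMs whose removal brings utilisation back under the threshold,
# pick the one with the least memory (cheapest/fastest to migrate live).

THRESHOLD = 0.80  # assumed upper utilisation threshold

def select_vm(host_util, vms):
    """vms: list of (name, cpu_share, memory_mb).
    Return the VM to migrate, or None if the host is not overloaded."""
    if host_util <= THRESHOLD:
        return None
    # candidates whose removal alone relieves the overload
    candidates = [v for v in vms if host_util - v[1] <= THRESHOLD]
    pool = candidates or vms  # fall back to any VM if none suffices alone
    return min(pool, key=lambda v: v[2])  # least memory to copy

vms = [("vm1", 0.30, 4096), ("vm2", 0.25, 2048), ("vm3", 0.35, 8192)]
print(select_vm(0.95, vms))  # → ('vm2', 0.25, 2048)
```

A production policy would also weigh the destination host's capacity and the VM's page-dirtying rate, which dominate live-migration time.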
The Internet of Things (IoT) is a system of interrelated computing devices, mechanical and digital machines, objects, animals, or people that are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. The IoT interconnects the physical, cyber, and social spaces. Most of its devices are resource constrained, and during interaction between devices the IoT suffers from severe security challenges, so the security of resource-constrained networks becomes of prime importance. Many existing mechanisms secure and protect networks and systems but are unable to provide fine-grained access control. This work focuses on enhancing the performance of an IoT system with high security and minimal use of resources on the constrained devices, i.e., the security-related load is kept on the resource-rich servers. The performance of a CoAP-based framework is enhanced and compared with existing secure CoAP implementations. Test results shall be compared for communication overhead and authentication delay.
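The offloading idea above, keeping security work off the constrained device, can be sketched as delegated fine-grained access control: the device forwards each (client, resource, action) triple to a resource-rich server and merely relays the decision. The policy table and function names are illustrative assumptions, not the paper's framework; only the CoAP response codes (2.05, 4.01) are standard.

```python
# Sketch of delegated fine-grained access control for a constrained device.
# The device performs no policy evaluation itself; the server holds the
# per-action permission table. All names and data are illustrative.

POLICY = {  # server side: client -> set of permitted (resource, action) pairs
    "sensor-app": {("/temperature", "GET")},
    "admin-app":  {("/temperature", "GET"), ("/config", "PUT")},
}

def server_authorize(client, resource, action):
    """Runs on the high-resource server; fine-grained per-action check."""
    return (resource, action) in POLICY.get(client, set())

def device_handle(client, resource, action):
    """Runs on the constrained device; delegates the decision entirely."""
    if server_authorize(client, resource, action):
        return "2.05 Content"        # CoAP success response code
    return "4.01 Unauthorized"       # CoAP client-error response code

print(device_handle("sensor-app", "/temperature", "GET"))  # → 2.05 Content
print(device_handle("sensor-app", "/config", "PUT"))       # → 4.01 Unauthorized
```

The trade-off this abstract measures follows directly: delegation saves device CPU and memory but adds a round trip, which shows up as authentication delay and communication overhead.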
With more and more devices being added to the IoT, security threats are increasing in the same proportion, which is a great challenge for researchers and users of the IoT. To make the IoT sustainable, many new algorithms are being applied to prevent security threats. Since the major security threats against new devices are the same as in the past, the possible solutions are also similar; however, they must be matched to the characteristics of the different devices deployed in the IoT, so a keen eye must be kept on both aspects when applying security algorithms. This paper focuses on discussing the security challenges, their solutions, and possible improvements. Follow-up work will focus on providing a better solution based on the findings of this paper, aiming not only at proactive prevention of attacks but also at high accuracy and system performance, since the IoT requires high performance to be maintained for good results.
In the current era, the IoT is becoming a necessity for mankind and interconnects the physical, cyber, and social spaces. Even with enhanced technology, most devices on the IoT are resource constrained. Resources are not the only problem: security over the IoT is also a severe concern. Many existing mechanisms secure and protect networks and systems but are unable to provide fine-grained access control. In our previous work, we proposed an algorithm for improving the performance of an IoT system with high security and minimal resource usage, i.e., the security-related load is kept on the few resource-rich servers. A new simulated environment has been created to test the security and performance of the proposed algorithm, and the results obtained from it are discussed in this work. Test results have been compared for communication overhead and authentication delay.
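The two metrics compared above can be computed straightforwardly from per-request logs, as in the sketch below. The log records are invented for demonstration; a real evaluation would collect them from the simulated environment.

```python
# Illustrative computation of the two comparison metrics:
# authentication delay (ms) and communication overhead (fraction of bytes).

def auth_delay(records):
    """Mean authentication delay in ms over (sent_ms, authenticated_ms) pairs."""
    return sum(authed - sent for sent, authed in records) / len(records)

def overhead_ratio(payload_bytes, total_bytes):
    """Fraction of transmitted bytes that is protocol/security overhead."""
    return 1 - payload_bytes / total_bytes

records = [(0, 12), (5, 19), (10, 21)]  # invented timing samples
print(auth_delay(records))              # mean delay over the samples
print(overhead_ratio(900, 1200))        # → 0.25
```

Reporting both metrics together matters because a scheme can trade one for the other, e.g. caching credentials lowers delay but may add handshake bytes.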
Machine Learning (ML) is being used in many fields in the current era. It is primarily a self-learning mechanism, initiated by developers from some initial data inputs, viz. a training set; the ML algorithm then learns automatically from the new data presented to it. The algorithms are drawn from computational mathematics, which makes ML very useful for the systems in which it is employed. Owing to its self-learning capacity and applicability to different areas, ML applications are increasing day by day. This research emphasizes the latest trends in ML technology along with their applications, covering the field from early techniques to newer trends such as Deep Learning (DL). The paper also includes ML applications in Artificial Intelligence (AI) to cover the complete set.
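The "seed with a training set, then keep learning from new data" loop described above can be sketched with a toy incremental classifier that tracks a running mean per class. The class, features, and data are entirely illustrative.

```python
# Minimal sketch of incremental (self-learning) classification:
# the model is seeded with an initial training set and updates its
# per-class running mean as each new labelled example arrives.

class MeanClassifier:
    def __init__(self):
        self.sums = {}    # label -> running sum of feature values
        self.counts = {}  # label -> number of examples seen

    def learn(self, x, label):
        """Fold one labelled example into the running statistics."""
        self.sums[label] = self.sums.get(label, 0.0) + x
        self.counts[label] = self.counts.get(label, 0) + 1

    def predict(self, x):
        """Assign x to the class with the nearest mean."""
        return min(self.sums,
                   key=lambda c: abs(x - self.sums[c] / self.counts[c]))

clf = MeanClassifier()
for x, y in [(1.0, "low"), (2.0, "low"), (9.0, "high")]:  # initial training set
    clf.learn(x, y)
clf.learn(8.0, "high")   # the model keeps learning from new data
print(clf.predict(2.5))  # → low
```

Deep learning generalizes the same loop: the "statistics" become millions of weights and the update rule becomes gradient descent, but training data still drives the model.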