Call for Papers -- Privacy-Preserving Techniques for Data Collection and Analysis in Ambient Intelligence Networks

2024-08-23

Privacy-preserving technologies let marketers continue to use data-driven systems while giving users the ability to safeguard the personally identifiable information that they hand over to, and that is managed by, service providers and applications. Techniques such as homomorphic encryption, federated learning, differential privacy, and secure multiparty computation (SMC) make statistical analysis and data sharing possible without jeopardising the security and integrity of the underlying data. A central difficulty in privacy-preserving computation is striking a balance between data privacy and statistical accuracy. In principle, privacy-preserving analytics lets firms retain, and even broaden, access to valuable data while protecting individuals' privacy. Maintaining privacy is crucial when handling sensitive data inside an organisation: it comprises techniques and procedures intended to shield data from unwanted access, use, or disclosure during processing and analysis. The goal of privacy-preservation processing techniques is to obscure, or even completely sever, the connection between sensitive data and its original owner without impairing the data's ability to offer insight into a particular occurrence of interest.
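To make one of these techniques concrete, the federated-learning idea can be sketched in a few lines of Python. This is a minimal illustration only: the plain-list model representation, the toy `local_update` rule, and the `federated_average` helper are illustrative assumptions, not a production protocol.

```python
def local_update(weights, data):
    # Hypothetical client-side step: each client nudges its copy of the
    # model towards the mean of its own data, which never leaves the device.
    target = sum(data) / len(data)
    return [w + 0.1 * (target - w) for w in weights]

def federated_average(client_models):
    # Server-side step: average the clients' weight vectors; the server
    # only ever sees model parameters, never the raw records.
    n = len(client_models)
    return [sum(ws) / n for ws in zip(*client_models)]

# Two clients train locally on private data, then the server aggregates.
global_model = [0.0, 0.0]
clients_data = [[1.0, 2.0, 3.0], [3.0, 4.0, 5.0]]
updates = [local_update(global_model, d) for d in clients_data]
global_model = federated_average(updates)
```

The key privacy property is in the communication pattern: only parameter vectors cross the network, so the server can improve the shared model without ever collecting the clients' raw data.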

Data can be encoded using encryption so that only authorised users can decipher it, but encryption alone is not enough to protect information. Data in databases is protected through a combination of encryption, access control, auditing, and data-integrity mechanisms. The Internet of Things (IoT) aims to give service providers ubiquitous access to a multitude of equipment and devices, yet IoT-based gadgets can expose users to a range of security and privacy risks. The main goal of privacy-preservation strategies is to safeguard data transfers of any kind between parties. Data privacy is the ability of individuals to control who can access their personal information, together with the safeguarding of that information from unauthorised parties. A large machine learning model can memorise its training set, which is itself a privacy risk. Maintaining privacy requires both a measure of privacy loss and control over data access. Because of its mathematical rigour, differential privacy is usually regarded as the gold standard of privacy protection.
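To illustrate why differential privacy admits this kind of mathematical guarantee, here is a minimal sketch of the classic Laplace mechanism applied to a counting query. The `dp_count` helper and its parameters are illustrative assumptions, not a reference implementation: a count has sensitivity 1, so adding Laplace noise with scale 1/epsilon yields epsilon-differential privacy.

```python
import math
import random

def laplace_noise(scale, rng):
    # Sample Laplace(0, scale) by inverse-CDF transform of a uniform draw.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon, rng=random):
    # A counting query changes by at most 1 when one record is added or
    # removed (sensitivity 1), so Laplace noise with scale 1/epsilon
    # gives epsilon-differential privacy for the released count.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon, rng)

# Example: a noisy count of patients over 40 in a hypothetical dataset.
ages = [23, 45, 61, 38, 52, 29]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Smaller epsilon means stronger privacy but noisier answers, which is exactly the privacy/accuracy trade-off mentioned above.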

In general, privacy refers to freedom from interruption or intrusion and the right to be left alone; information privacy is the right to some control over how your personal information is collected and used. Ensuring that sensitive data, such as financial or medical records, is accessed only by authorised individuals is one example of data privacy; it can be enforced through biometric authentication or access-control mechanisms such as usernames and passwords. Data encryption is another example. Data privacy is chiefly concerned with how data is gathered, stored, and retained, and with data transfers that fall under relevant rules and legislation such as the GDPR and HIPAA. Data security, by contrast, refers to safeguarding data from loss, corruption, and unauthorised access throughout the data lifecycle. We welcome submissions from a variety of fields and viewpoints on privacy-preserving techniques for data collection and analysis in ambient intelligence networks.
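As a small illustration of the password-based access control mentioned above, a salted password check can be built from Python's standard library alone. This is a minimal sketch; the helper names and the iteration count are illustrative choices, not a recommended configuration.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=100_000):
    # Derive a salted hash; the salt prevents precomputed-table attacks
    # and the iteration count slows down brute-force guessing.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, digest, iterations=100_000):
    # Recompute the hash and compare in constant time to avoid
    # leaking information through timing differences.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)
```

Storing only the salt and digest, never the password itself, is the point: even a compromised credential store does not directly reveal users' passwords.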

The topics relevant to this special issue include but are not limited to:

  • A thorough examination of privacy-preserving technologies created for social media platforms.
  • Effective privacy-preserving third-party auditing for ambient intelligence systems.
  • Privacy-preserving raw data collection for IoT without the need for a trusted authority.
  • Predicting human behaviour with non-intrusive, privacy-preserving IoT-aided surveillance.
  • Data privacy-preserving models for cloud environments based on deep neural networks and differential privacy.
  • Privacy-preserving intrusion detection frameworks for software-defined IoT fog environments.
  • Privacy-preserving sanitisation strategies for data mining in socially distributed contexts.
  • Green computing tools for privacy-aware data aggregation in IoT-based healthcare.
  • Cloud-enabled privacy-preserving collaborative learning for mobile sensing.
  • Verifiable privacy-preserving user authentication schemes for wireless sensor networks, grounded in IoT security.
  • Anonymous privacy-preserving data collection schemes for IoT-based healthcare service systems.
  • Privacy-aware and secure human-computer interaction for pervasive healthcare surveillance.


Guest Editors:
Dr. Mehdi Gheisari (Managing GE), Department of Computer Science, Islamic Azad University, Tehran, Iran
Dr. Subhendu Kumar Pani (First Co-GE), Krupajal Engineering College, Biju Patnaik University of Technology, India
Dr. Kamalakanta Muduli (Second Co-GE), Department of Mechanical Engineering, The Papua New Guinea University of Technology, Papua New Guinea


Special Issue Timeline:
Submission Deadline: January 25, 2025
Author Notification: March 30, 2025
Revised Submission: May 31, 2025
Final Acceptance: July 25, 2025