[Coral_current] [IKDD News] 1Million Deepfakes Detection Challenge @ ACM Multimedia (fwd)

Prakhar Dixit pdixit1 at umbc.edu
Mon Mar 11 15:51:14 EDT 2024


It is indeed an interesting one.

Thanks
Prakhar



On Mon, Mar 11, 2024 at 3:41 PM Tim Oates <oates at cs.umbc.edu> wrote:

> An interesting competition.
>
>
> ---------------------------------------
> Tim Oates, Professor
> Department of CS and EE
> University of Maryland Baltimore County
> (410) 455-3082
> https://coral-lab.umbc.edu/oates/
>
> ---------- Forwarded message ----------
> Date: Thu, 7 Mar 2024 04:29:41 -0800 (PST)
> From: Abhinav Dhall <dhallabhinav at gmail.com>
> To: IKDD News <ikdd-news at googlegroups.com>
> Subject: [IKDD News] 1Million Deepfakes Detection Challenge @ ACM
> Multimedia
>
> Call for Participation - 1 Million Deepfakes Detection Challenge
> https://deepfakes1m.github.io/
> at ACM Multimedia 2024, Melbourne
>
> The tremendous progress in generative AI has made the generation and
> manipulation of synthetic data easier and faster than before, and many use
> cases are benefitting from it. The negative aspect of this progress and the
> wide adoption of generative AI is deepfakes: audio, images, or video of an
> individual are manipulated using generative methods without the
> individual's permission, so that they appear to say or do something they
> never actually said or did. These unethically manipulated videos, popularly
> known as deepfakes, have wide repercussions and negative effects on society
> through their potential for spreading disinformation and misinformation.
> Deepfakes are, unfortunately, also used for online trolling. Authentication
> systems such as video KYC (Know Your Customer) are not resilient either, as
> face recognition and verification systems are often deceived by
> high-quality deepfakes. It is therefore important for platforms and systems
> to be able to identify whether a piece of media has been manipulated. The
> systems that detect and analyse deepfakes are referred to as deepfake
> detectors.
>
> The 1M-Deepfakes Detection Challenge comprises two sub-tasks:
>
> a. Deepfake Detection - Given an audio-visual sample containing a single
> subject, the task is to identify whether the video is real or a deepfake.
>
> b. Deepfakes Temporal Localisation - Given an audio-visual sample
> containing a single subject, the task is to identify the frames (time
> stamps) in which the manipulation was done. The assumption here is that,
> from the perspective of spreading misinformation, editing a few vital parts
> of a video may be enough to change the meaning of the original video, and
> at the same time the quality of such a deepfake will be closer to the
> original than a deepfake in which the entire original video is manipulated.
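>
> As a rough, hypothetical illustration (not the challenge baseline, API, or
> submission format), the outputs of the two sub-tasks can be pictured as
> follows; here `model`, `load_audio_visual`, the frame rate, and the 0.5
> threshold are placeholder assumptions only:
>
>     # Sub-task (a), sketched: video-level real/fake prediction.
>     def detect(sample_paths, model, load_audio_visual, threshold=0.5):
>         """Label each audio-visual sample 'fake' or 'real' (toy sketch)."""
>         labels = {}
>         for path in sample_paths:
>             clip = load_audio_visual(path)   # hypothetical loader: audio + frames
>             score = model(clip)              # assumed to return P(clip is fake)
>             labels[path] = "fake" if score >= threshold else "real"
>         return labels
>
>     # Sub-task (b), sketched: turn per-frame fake scores into manipulated
>     # (start_sec, end_sec) segments; fps and threshold are assumptions.
>     def localise(frame_scores, fps=25.0, threshold=0.5):
>         segments, start = [], None
>         for i, score in enumerate(frame_scores):
>             if score >= threshold and start is None:
>                 start = i                    # a manipulated run begins
>             elif score < threshold and start is not None:
>                 segments.append((start / fps, i / fps))
>                 start = None                 # the run ends
>         if start is not None:                # run reaches the final frame
>             segments.append((start / fps, len(frame_scores) / fps))
>         return segments
>
>     # Example: localise([0.1, 0.2, 0.9, 0.95, 0.8, 0.1]) -> [(0.08, 0.2)]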
>
> Challenge Registration - https://deepfakes1m.github.io/
>
> Timeline
> Training and Validation Data - available now
> Test Data - mid-May
> Paper submission deadline - June 14
>
> Thanks,
> Abhinav Dhall
>
