Closed (220 participants)

Topic provider

Taiwan Centers for Disease Control (Taiwan CDC) is the competent authority responsible for the prevention and control of communicable diseases in Taiwan. Our mission is to protect people from the threats of communicable diseases. In recent years, with dramatic increases in international travel and in the number of foreign laborers, various communicable diseases have been imported to Taiwan.

Facing the threat of emerging and re-emerging communicable diseases, Taiwan CDC has built up a comprehensive surveillance network, including the National Notifiable Disease Surveillance System, and has formulated policies for disease prevention and quarantine while strengthening its laboratory testing and research capabilities. Linkage among the different systems provides real-time information exchange and enables a prompt response to possible outbreaks. Taiwan CDC devotes every effort to further strengthening research capacity and recruiting experts to combat the threats of communicable diseases in a scientific way.

Introduction


Dengue fever is an acute infectious disease transmitted by mosquitoes. Dengue outbreaks in Taiwan usually peak in the summer. Mild cases may present with symptoms such as fever, headache, and myalgia, while severe cases may involve severe fluid leakage, hemorrhagic symptoms, shock, organ failure, coma, and even death. The mortality rate can reach 20% or more if the patient does not receive proper treatment in time.

To effectively prevent dengue fever outbreaks, cleaning up mosquito breeding sites is essential. Possible breeding sites include any container that holds stagnant water, such as bottles, basins, buckets, cans, cups, bowls, tires, and plastic bags.

Every year, the Taiwan Centers for Disease Control collaborates with local health departments to inspect communities and find uncleaned sites with containers that may hold stagnant water and later become mosquito breeding sites. However, these inspections require tremendous manpower and time. This challenge provides labeled data for the various types of containers and aims to build an object detection model for possible breeding sites, so that inspectors can pinpoint containers holding stagnant water from digital camera images or live video and thus improve the effectiveness of inspection and breeding-site elimination.


Prize Information



First place: 100,000 hicloud discount points

Second place: 50,000 hicloud discount points

Third place: 50,000 hicloud discount points


Rewards were provided by Chunghwa Telecom.

Note:

1. Hicloud points can be redeemed against service charges. After the discount, charges are automatically billed at 30% off the list price.

2. Chunghwa Telecom reserves the right to make changes to the terms and conditions herein.


Event Schedule

This event is conducted in National Standard Time (UTC+8). The schedule is as follows:

Time                   Event
2019/01/28             Topic announcement published
2019/02/20             Registration opens
2019/02/20             Training dataset and public test dataset released; public test dataset open for testing (public leaderboard)
2019/05/13 23:59:59    Registration closes
2019/05/14             All test datasets released (public and private; the private leaderboard will be announced when the competition ends)
2019/06/01 23:59:59    Competition ends (the final upload is the benchmark for scoring)

Evaluation Criteria


The evaluation metric for this topic is mean Average Precision (mAP)[1] at an intersection over union (IoU)[2] threshold of 0.5.

If the IoU between a detection and the ground truth is greater than 0.5, the detection is counted as a true positive; otherwise it is counted as a false positive. Precision is then computed from these counts.
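A minimal sketch of this matching criterion, assuming axis-aligned boxes in (x1, y1, x2, y2) pixel coordinates (the competition's actual annotation format may differ):

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A detection overlapping the ground truth with IoU > 0.5 is a true positive.
gt  = (0, 0, 100, 100)
det = (25, 0, 125, 100)   # same box shifted right by 25 px
print(iou(gt, det))       # 7500 / 12500 = 0.6 -> true positive
```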

The system computes an AP score for each of the 13 types of water-container objects and averages them to obtain the mAP evaluation value. Participants are ranked according to this criterion.
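The averaging idea can be sketched as follows. Note this is only an illustration: the helper below uses the common "all-points" precision-recall interpolation, whereas the official score is produced by the COCO API, whose evaluator differs in detail (e.g. it samples precision at fixed recall points).

```python
def average_precision(scored_hits, num_gt):
    """AP for one class from a list of (confidence, is_true_positive) detections.

    Area under the precision-recall curve, with precision made
    monotonically non-increasing (all-points interpolation).
    """
    scored_hits.sort(key=lambda t: -t[0])        # rank detections by confidence
    tps = 0
    precisions, recalls = [], []
    for rank, (_, is_tp) in enumerate(scored_hits, start=1):
        tps += is_tp
        precisions.append(tps / rank)
        recalls.append(tps / num_gt)
    # smooth the curve: each precision becomes the max of all to its right
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    ap, prev_recall = 0.0, 0.0
    for p, r in zip(precisions, recalls):
        ap += p * (r - prev_recall)              # rectangle under the curve
        prev_recall = r
    return ap

def mean_ap(per_class):
    """mAP = mean of per-class APs (here there would be 13 container classes)."""
    aps = [average_precision(dets, n_gt) for dets, n_gt in per_class]
    return sum(aps) / len(aps)
```

For example, a class whose ranked detections are TP, FP, TP against 2 ground-truth boxes yields an AP of 5/6.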

The system calculates the mAP evaluation values using the COCO API[3].

References
[1] Average Precision (AP): https://en.wikipedia.org/wiki/Evaluation_measures_%28information_retrieval%29#Average_precision

[2] intersection over union (IoU):
https://en.wikipedia.org/wiki/Jaccard_index

[3] COCO API:
https://github.com/cocodataset/cocoapi

Rules

  • Competition results are evaluated on the final upload; if scores tie, the earlier upload ranks higher.
  • Any cheating in the competition will result in termination of competition eligibility, and the vacancy will be filled by other participants according to their rankings.
  • External, legitimately licensed public datasets are allowed in the competition; however, to maintain fairness, participants who use external datasets must post a description and the original source in the discussion section for other participants’ reference.
  • After uploading answers for the test data, scoring proceeds in two stages:
    • Before 2019/05/13 23:59:59 (UTC+8), the system scores only the first-stage test data (the public test data test_pub_cdc.zip, open for download from 2019/02/20) and posts the results on the public leaderboard. This portion is 23.7% of the total test data.
    • From 2019/05/14 00:00:00 (UTC+8), all test data are released, including the private test data, which is 76.3% of the total. Participants must predict on all test data and upload their answers. After the competition ends, the private leaderboard is released; results are evaluated on the latest upload and rankings are made accordingly.