
cleaning and reorganizing fuse core code on segmentation #318

Merged: 14 commits, May 31, 2023

Conversation

itaijj (Collaborator)

@itaijj itaijj commented May 29, 2023

No description provided.

@itaijj itaijj changed the title cleaning segmentation loss old code cleaning and reorganizing fuse segmentation code code May 29, 2023
@itaijj itaijj changed the title cleaning and reorganizing fuse segmentation code code cleaning and reorganizing fuse core code on segmentation May 29, 2023
@itaijj itaijj requested a review from egozi May 29, 2023 14:24


class WeightedFocalLoss(nn.Module):
"Non weighted version of Focal Loss"
Collaborator:

non-weighted or weighted?

Collaborator Author:

Waiting to decide whether to keep it or not.

self.alpha = torch.tensor([alpha, 1 - alpha])
self.gamma = gamma

def forward(self, inputs: Tensor, targets: Tensor) -> Tensor:
Collaborator:

Why don't you get batch_dict instead of tensors?

Collaborator Author:

Waiting to decide whether to keep it or not.

Collaborator:

Add comments specifying which tensors you expect, including their shape and content.

Collaborator Author:

Waiting to decide whether to keep it or not.

return result


class WeightedFocalLoss(nn.Module):
Collaborator:

Why did you move it? Are you using it?
If not, let's delete it. It's old code.

Collaborator Author:

I just think focal loss is not specific to segmentation, and we don't have it anywhere else in fuse, so I moved it out of the segmentation folder.
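For context, the `WeightedFocalLoss` under discussion can be sketched roughly as follows. This is a minimal sketch of the common binary focal loss recipe, not the exact fuse implementation; the expected tensor shapes and the default `alpha`/`gamma` values are assumptions.

```python
import torch
from torch import Tensor, nn


class WeightedFocalLoss(nn.Module):
    """Binary focal loss with a per-class alpha weight.

    inputs: raw logits, shape (N,); targets: binary labels, shape (N,).
    """

    def __init__(self, alpha: float = 0.25, gamma: float = 2.0):
        super().__init__()
        # alpha is indexed by the target label:
        # negatives (label 0) get alpha, positives (label 1) get 1 - alpha
        self.alpha = torch.tensor([alpha, 1 - alpha])
        self.gamma = gamma

    def forward(self, inputs: Tensor, targets: Tensor) -> Tensor:
        bce = nn.functional.binary_cross_entropy_with_logits(
            inputs, targets, reduction="none"
        )
        pt = torch.exp(-bce)  # model's probability for the correct class
        at = self.alpha.to(inputs.device)[targets.long()]
        # (1 - pt) ** gamma down-weights easy, well-classified examples
        return (at * (1 - pt) ** self.gamma * bce).mean()
```

With `gamma = 0` and `alpha = 0.5` this reduces to half the plain BCE loss, which makes the "weighted or non-weighted" question in the review concrete: the weighting lives entirely in `alpha`.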

@@ -44,7 +42,7 @@ def make_one_hot(input: Tensor, num_classes: int) -> Tensor:


class BinaryDiceLoss(nn.Module):
-    def __init__(self, power: int = 1, eps: float = 1.0, reduction: str = "mean"):
+    def __init__(self, eps: float = 1e-5, reduction: str = "mean"):
Collaborator:

Why did you change the defaults?

Collaborator:

Remove power from the comments as well.

Collaborator Author (@itaijj, May 31, 2023):

I think the epsilon for smoothing should be a small number, not 1. In MONAI the value is 1e-5; do you have any reference besides @egozi's old code for keeping it at 1?
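To illustrate the point being debated, here is a minimal `BinaryDiceLoss` sketch using the small 1e-5 smoothing term. It is modeled on the signature in the diff above, not the exact fuse code; the tensor layout and reduction handling are assumptions.

```python
import torch
from torch import Tensor, nn


class BinaryDiceLoss(nn.Module):
    """Dice loss for binary segmentation.

    eps is a small smoothing term added to numerator and denominator;
    1e-5 (as in MONAI) keeps the loss close to the true Dice score,
    whereas eps = 1.0 noticeably biases the loss on small masks.
    """

    def __init__(self, eps: float = 1e-5, reduction: str = "mean"):
        super().__init__()
        self.eps = eps
        self.reduction = reduction

    def forward(self, predict: Tensor, target: Tensor) -> Tensor:
        # predict, target: shape (N, *), values in [0, 1]
        predict = predict.contiguous().view(predict.shape[0], -1)
        target = target.contiguous().view(target.shape[0], -1)
        intersection = (predict * target).sum(dim=1)
        denominator = predict.sum(dim=1) + target.sum(dim=1)
        loss = 1.0 - (2.0 * intersection + self.eps) / (denominator + self.eps)
        if self.reduction == "mean":
            return loss.mean()
        if self.reduction == "sum":
            return loss.sum()
        return loss  # reduction == "none"
```

The bias the reviewer raises is easy to see: on a one-pixel mask that the model misses entirely, `eps = 1.0` gives a loss of only 0.5 (since 1 - 1/2 = 0.5), while `eps = 1e-5` gives a loss near 1.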

@itaijj itaijj requested a review from mosheraboh May 31, 2023 13:26
@itaijj itaijj merged commit b5e6145 into master May 31, 2023