This is Part 2 of a two-part series on 764. If you have not read Part 1 yet, start there first. It explains what 764 is and how they find your child.
Last week I told you that 764 is hunting children on Roblox, Minecraft, Discord, and Instagram. That the FBI has more than 450 open investigations. That a 13-year-old boy named Jay died because a predator in Germany spent months grooming him on a platform his parents thought was safe.
Today I want to be practical. Because knowing the threat exists is not enough. You need to know what it looks like and what to do about it.
Here is what the FBI and researchers who have studied 764 say parents should watch for.
Sudden and unexplained changes in behavior — withdrawal, irritability, anxiety, or depression with no obvious cause
Becoming secretive about online activity — closing screens or devices when you walk in the room
Unexplained cuts, scratches, bruises, or marks on the body — particularly on arms or areas that can be covered by clothing. An unexplained cut on a child's arm is not always what it appears to be.
Receiving gifts, money, or items from people you do not know
Spending unusual amounts of time online — especially late at night
Referring to new online friends you have never met in real life
Becoming upset, scared, or withdrawn after being online
Unexplained harm to younger siblings or pets
Talking about or showing interest in self-harm, nihilism, or extremist ideas
Avoiding school, friends, or family activities they previously enjoyed
The FBI is clear on one important point: children who are targeted are not at fault. These predators are patient, sophisticated, and have created detailed guides on how to manipulate children. They specifically target children who are lonely, struggling, or going through a difficult time. The vulnerability that makes a child a target is the same vulnerability that makes it hard for them to ask for help.

If you suspect your child has been contacted by a 764 member or similar predator, here is what to do.
Stay calm. Your child needs to know they are safe with you. Panic or anger will shut down the conversation.
Do not delete anything. Screenshots, messages, usernames, and profile information are all evidence. Preserve it before reporting.
Report to the FBI at fbi.gov/how-we-can-help-you/victim-services/cenp
Report to the National Center for Missing and Exploited Children CyberTipline at report.cybertip.org
Contact your local law enforcement and tell them specifically you believe your child has been contacted by a member of the 764 network.
Contact the platform directly — Discord, Roblox, Instagram — to report the account. Save the username before reporting, since accounts are sometimes removed quickly.
Get your child professional support. A trauma-informed therapist who works with children is important. Your child has been manipulated by someone who knew exactly what they were doing. That requires more than a conversation.
What every parent needs to know right now
Before we talk about what schools and lawmakers need to do, I want to say something directly to parents.
Children under 13 should not have a smartphone. If they do have a phone, it should only be able to make and receive calls — for emergencies. It should not have access to social media. Full stop.
If your child has a phone, never allow them to take it to bed with them. Predators are most active at night. That is when children are alone, unsupervised, and most vulnerable. The phone stays outside the bedroom. This is not negotiable.
Stranger danger is not just for the physical world. Everything we taught our children about not talking to strangers in person applies online too. If your child would not open the door to a stranger and invite them into their bedroom, they should not be having private conversations online with people they have never met in real life.
And this cannot be said clearly enough. Never send a photo of yourself without clothing on to anyone online. Not to a friend. Not to someone you think you are in a relationship with. Not to anyone. Ever. No legitimate relationship requires it. Anyone who asks is not who they say they are.
Now I want to talk about what schools need to do. Because this is not just a problem for parents to solve at home.
The same children being targeted by 764 online are sitting in classrooms every day. Teachers see them. Counselors see them. School resource officers interact with them. The behavioral warning signs of online exploitation look a lot like the behavioral warning signs of a student in crisis — withdrawal, self-harm, sudden changes in behavior. Schools already have threat assessment teams trained to recognize and respond to students who are struggling. Those same systems need to be applied to online exploitation.
School nurses in particular need to be educated about what they may be seeing. Unexplained cuts or marks on a student's body may not be what they appear to be. They may be the result of online coercion. A nurse who knows what to look for and what questions to ask could be the person who intervenes before something irreversible happens.
Florida already has a statewide behavioral threat assessment model. Every concerning behavior is supposed to be documented and reviewed. A student showing up with unexplained marks should trigger that process. A student who mentions online pressure or coercion should trigger that process. But only if the adults in the building are trained to ask the right questions and take the answers seriously.

Every school district in America should be doing three things about 764 right now. First, educating all staff (including nurses, coaches, and front office personnel) on what online exploitation looks like and how to recognize it. Second, including online safety in student education, not a one-time assembly but ongoing conversation built into the curriculum at every grade level. Third, building a clear protocol for what happens when a student discloses online exploitation so that every adult in the building knows who to call and what to do.
I am also calling on lawmakers at the state and federal level to act. The ECCHO Act introduced in the Senate in late 2025 would create penalties of up to life in prison for cases involving the death of a victim. That is the right direction. But legislation that punishes perpetrators after the fact is not enough. We need platforms to be held accountable for creating environments where this is possible. Discord banned 34,000 accounts in 2023 alone. A new account takes seconds to create. That is not a solution. That is theater.
I know this is heavy. I know it is frightening. I have spent eight years delivering frightening information to parents about what happens when schools fail to protect children. I do not do it to scare people. I do it because the alternative is staying silent while children are hurt.
You cannot protect your child from something you do not know exists. Now you know it exists. Share this with every parent you know.
RESOURCES FOR PARENTS
Report online exploitation to the FBI: fbi.gov/how-we-can-help-you/victim-services/cenp
National Center for Missing and Exploited Children CyberTipline: report.cybertip.org
988 Suicide and Crisis Lifeline: call or text 988 — available 24 hours a day, 7 days a week
Internet Crimes Against Children Task Force: icactaskforce.org

Safe Schools For Alex works to protect children inside the building and out. Share this article with every parent you know.
This topic is not going away. I will continue to cover 764 and online threats to children as long as they exist. Because Alex deserved a safe school. Every child deserves a safe world.
For Alex. And for all of them.
— Max Schachter
